Senior Data Engineer leading data engineering efforts for cloud migration and Ceded Reinsurance integration at The Hartford. Mentoring junior engineers while developing scalable data pipelines and strategies.
Responsibilities
Develop and maintain scalable data pipelines and ETL processes to support reinsurance processing, actuarial analysis, and reporting.
Build automated data pipelines with tools like Airflow and MLflow that serve advanced statistical models and track key metrics.
Assess new data sources and techniques.
Mentor junior engineers and collaborate with various teams.
Consult with cross-functional team partners to understand business needs and develop reliable and effective solutions that are easily maintainable.
Use Enterprise GitHub for version control, documentation, code collaboration, and technical project management.
Regularly engage in continuing education, including the development of MLOps and Cloud Engineering skills as well as Insurance Business proficiency.
Present developments in data and analytics forums.
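The pipeline work described above can be illustrated with a minimal extract-transform-load sketch in plain Python. This is a hypothetical example, not The Hartford's actual implementation: the column names (`policy_id`, `ceded_premium`) and function names are invented for illustration, and a production pipeline would run these steps as orchestrated tasks in a tool like Airflow.

```python
# Minimal, hypothetical extract -> transform -> load sketch; in production
# each function would typically be a task in an Airflow DAG.
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into records (stand-in for a source-system read)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Normalize types and drop records missing a ceded premium."""
    cleaned = []
    for rec in records:
        if rec.get("ceded_premium"):  # skip blank/missing premiums
            cleaned.append({
                "policy_id": rec["policy_id"],
                "ceded_premium": float(rec["ceded_premium"]),
            })
    return cleaned

def load(records: list[dict]) -> int:
    """Stand-in for a warehouse write; returns the row count as a metric."""
    return len(records)

raw = "policy_id,ceded_premium\nP1,100.5\nP2,\nP3,250.0\n"
rows_loaded = load(transform(extract(raw)))
print(rows_loaded)  # → 2 (P2 is filtered out for its missing premium)
```

The row count returned by `load` is the kind of key metric the role would track across pipeline runs.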
Requirements
Candidates must be authorized to work in the US without company sponsorship
Bachelor's degree and 5+ years of experience with Informatica data pipelines and automation
Knowledge of Ceded Reinsurance business is a must
Experience in on-premises to cloud migration of data pipelines, modernizing them to leverage full cloud potential
Proficient with AWS Glue jobs
Experience using Gen AI solutions to automate data engineering pipelines preferred
Proficient in at least one programming language, such as Java or Python
Strong expertise in Systems Architecture
5+ years of deep practical experience in the role, with repeated experience performing business analysis in a variety of complex situations
Working knowledge of the ProCede application suite or similar Ceded Reinsurance system is a big plus
Experience in the conversion/migration of Policy/Claim Admin into ProCede or similar Ceded Reinsurance system
Ability to effectively interact with all levels of the organization (from entry-level end-users to senior management) and demonstrate strong leadership skills
Experience working in multi-source offshore delivery centers and shared services environments is a plus
Experience in Agile development, including prioritizing, creating/updating features and stories, and Test-Driven Development
Able to communicate effectively with both technical and non-technical teams
Senior Data Engineer at Keyrus focusing on data solutions and projects to drive performance. Collaborating with teams globally to enhance data transformation and governance processes.
Data Engineer developing scalable data pipelines for ETL/ELT processes using GCP services. Collaborating with team members to optimize data workflows and ensure data integrity.
Data Governance Engineer in Fintech developing a formal cyber data governance framework. Collaborating with cyber security, analytics, and platform engineering teams on metadata and lineage capabilities.
Junior Data Engineer role at Allegro, focusing on developing ETL/ELT pipelines and processing large datasets. Collaborating with cross-functional teams on data quality and reporting.
Data Engineer at Concept Reply developing innovative data-driven solutions in IoT. Collaborating with teams to unlock the potential of data and cloud computing.
Data Engineer creating and managing data pipelines for critical data solutions at S&P Global. Collaborating on enterprise-scale data processing in a supportive, innovative environment.
Data Engineer supporting and evolving the data environment during a cloud migration. Maintaining and optimizing existing databases while designing modern data solutions through cross-functional collaboration.
Senior Data Engineer responsible for data pipeline projects at Suprema Gaming. Focus on batch and streaming data solutions while collaborating with business teams.
Senior data leader managing the enterprise data architecture at Breakthru Beverage. Leading high-performing teams in data engineering and defining modern data strategies.
Data Engineer at Equinix implementing data architecture solutions for scalability and analytics. Collaborating with teams to design data pipelines and maintain data models for business objectives.