Develop and maintain scalable data pipelines and ETL processes to support reinsurance processing, actuarial analysis, and reporting.
Build automated data pipelines with tools like Airflow and MLflow that serve advanced statistical models and track key metrics.
Assess new data sources and techniques.
Mentor junior engineers and collaborate with various teams.
Consult with cross-functional team partners to understand business needs and develop reliable, effective, and easily maintainable solutions.
Use Enterprise GitHub for version control, documentation, code collaboration, and technical project management.
Regularly engage in continuing education, including the development of MLOps and Cloud Engineering skills as well as Insurance Business proficiency.
Present developments in data and analytics forums.
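To illustrate the pipeline work described above, here is a minimal, hypothetical ETL step of the kind an Airflow task might wrap; the table name, column names, and sample data are invented for this sketch and do not come from the posting:

```python
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list:
    """Parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list) -> list:
    """Keep only rows with a positive ceded premium and normalize types."""
    return [
        (r["treaty_id"], float(r["ceded_premium"]))
        for r in rows
        if float(r["ceded_premium"]) > 0
    ]

def load(records: list, conn: sqlite3.Connection) -> int:
    """Insert normalized records and return the total row count."""
    conn.execute("CREATE TABLE IF NOT EXISTS ceded (treaty_id TEXT, premium REAL)")
    conn.executemany("INSERT INTO ceded VALUES (?, ?)", records)
    return conn.execute("SELECT COUNT(*) FROM ceded").fetchone()[0]

# Hypothetical input: two valid treaties and one with zero ceded premium.
raw = "treaty_id,ceded_premium\nT-001,1000.0\nT-002,0\nT-003,250.5\n"
conn = sqlite3.connect(":memory:")
count = load(transform(extract(raw)), conn)
```

In a real deployment each function would become its own Airflow task, so failures can be retried per step and run metrics tracked per task.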
Requirements
Candidates must be authorized to work in the US without company sponsorship
Bachelor's degree and 5+ years of experience with Informatica data pipelines and automation
Knowledge of the Ceded Reinsurance business is a must
Experience migrating on-premises data pipelines to the cloud and modernizing them to leverage its full potential
Proficient with AWS Glue jobs
Experience using Gen AI solutions to automate data engineering pipelines is preferred
Proficient in at least one programming language, such as Java or Python
Strong expertise in Systems Architecture
5+ years of deep practical experience in the role, with repeated experience performing business analysis in a variety of complex situations
Working knowledge of the ProCede application suite or similar Ceded Reinsurance system is a big plus
Experience converting/migrating Policy/Claim Admin data into ProCede or a similar Ceded Reinsurance system
Ability to effectively interact with all levels of the organization (from entry-level end-users to senior management) and demonstrate strong leadership skills
Experience working in multi-source offshore delivery centers and shared services environments is a plus
Experience in Agile development, including prioritizing, creating/updating features and stories, and Test-Driven Development
Able to communicate effectively with both technical and non-technical teams
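As a sketch of the AWS Glue proficiency listed above, the following pure-Python transform shows logic of the kind a Glue Python shell job might run; the bucket, table, and field names are hypothetical, and a production job would use the awsglue/boto3 APIs rather than plain strings:

```python
from datetime import date

def partition_path(bucket: str, table: str, run_date: date) -> str:
    """Build an S3-style partitioned output path, as a Glue job commonly writes."""
    return (
        f"s3://{bucket}/{table}/"
        f"year={run_date.year}/month={run_date.month:02d}/day={run_date.day:02d}/"
    )

def dedupe_latest(rows: list, key: str, version: str) -> list:
    """Keep only the highest-version record per key, a common Glue-style transform."""
    latest = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[version] > latest[k][version]:
            latest[k] = row
    return list(latest.values())

# Hypothetical claim records with a superseded version of C1.
path = partition_path("reinsurance-lake", "ceded_claims", date(2024, 5, 7))
rows = [
    {"claim_id": "C1", "version": 1, "amount": 100},
    {"claim_id": "C1", "version": 2, "amount": 120},
    {"claim_id": "C2", "version": 1, "amount": 80},
]
deduped = dedupe_latest(rows, "claim_id", "version")
```

Keeping transforms as plain functions like these makes them unit-testable outside Glue, which fits the Test-Driven Development practice mentioned above.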