Data Engineering Lead designing and optimizing scalable data pipelines for a property insurance company. Collaborating across teams, mentoring engineers, and driving data architecture initiatives.
Responsibilities
Build and optimize distributed data processing jobs using Apache Spark on Databricks.
Implement Delta Lake, Delta Live Tables (DLT) pipelines, dbt transformations, and the Medallion architecture for scalable, reliable data workflows (see the first sketch after this list).
Design and automate ETL pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
Integrate data from diverse sources, including Duck Creek, Intacct, Workday, and external APIs.
Develop dimensional models (star/snowflake schemas), stored procedures, and views for data warehouses (see the second sketch after this list).
Ensure efficient querying and transformation using SQL, T-SQL, and PySpark.
Leverage Azure DevOps, CI/CD pipelines, and GitHub for version control and deployment.
Utilize Azure Logic Apps for workflow automation and MLflow for experiment tracking and model lifecycle management (see the third sketch after this list).
Implement role-based access control (RBAC), data encryption, and auditing mechanisms.
Work closely with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
Mentor junior engineers and contribute to code reviews and architectural decisions.
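For illustration, here is a minimal PySpark sketch of a bronze-to-silver Medallion step with Delta Lake on Databricks. This is a sketch under assumptions, not a prescribed implementation: the schemas, table names, landing path, and columns (bronze.claims, claim_id, loss_date, etc.) are hypothetical.

```python
# Minimal bronze -> silver Medallion step with Delta Lake on Databricks.
# All table names, paths, and columns below are hypothetical; adapt them
# to your own workspace and source schemas.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # supplied by the Databricks runtime

# Bronze: ingest raw claim files as-is, preserving source fidelity.
raw = spark.read.format("json").load("/mnt/landing/claims/")  # hypothetical path
raw.write.format("delta").mode("append").saveAsTable("bronze.claims")

# Silver: cleanse, conform types, and deduplicate for downstream consumers.
silver = (
    spark.table("bronze.claims")
    .withColumn("claim_amount", F.col("claim_amount").cast("decimal(18,2)"))
    .withColumn("loss_date", F.to_date("loss_date"))
    .dropDuplicates(["claim_id"])
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.claims")
```

A gold layer would typically follow the same pattern, aggregating silver tables into business-level marts.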
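As a second sketch, a star-schema pattern expressed as a Spark SQL view: a claims fact joined to a policy dimension for BI consumption. The table and column names (silver.claims, silver.policies, gold.v_claims_fact) are illustrative assumptions, not a fixed model.

```python
# Hypothetical star-schema view: claims fact joined to a policy dimension.
# Names are illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE OR REPLACE VIEW gold.v_claims_fact AS
    SELECT f.claim_id,
           f.claim_amount,
           f.loss_date,
           d.policy_number,     -- dimension attributes for slicing in BI tools
           d.product_line,
           d.effective_date
    FROM silver.claims f
    JOIN silver.policies d
      ON f.policy_id = d.policy_id
""")
```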
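Finally, a third sketch showing MLflow experiment tracking. The model, run name, and metric here are placeholders on synthetic data, assuming only that MLflow and scikit-learn are installed (both ship with the Databricks ML runtime).

```python
# Sketch of MLflow experiment tracking; model and metric are placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic data stands in for real features.
X, y = make_classification(n_samples=1_000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="claims-severity-baseline"):  # hypothetical name
    model = LogisticRegression(max_iter=500).fit(X_train, y_train)
    mlflow.log_param("max_iter", 500)                         # record config
    mlflow.log_metric("test_accuracy", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, "model")                  # store the artifact
```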
Requirements
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
5+ years of experience in data engineering with at least 2 years on Databricks.
Proficiency in Python, Scala, SQL, and Spark.
Hands-on experience with Azure Data Services (ADF, ADLS, Synapse).
Strong understanding of ETL, data warehousing, and data modeling concepts.
Experience with MicroStrategy and Power BI, including DAX and advanced visualizations.
Familiarity with MLflow, LangChain, and LLM integration is a plus.
Knowledge of Duck Creek is a plus.
Insurance domain knowledge is preferred.
Preferred Certifications
Databricks Data Engineering Professional
Azure or AWS data engineering certifications
Benefits
Opportunities to stretch and grow: your professional and personal development matters to us.
Clarity and kindness: you can rely on us to be open, honest and supportive, offering clarity on what success looks like.
Support in good times and bad: we believe in showing up for each other consistently, not only when it’s easy.
A community that cares: we are committed to sustaining a community in which each person feels cared for as an individual.