Data Engineering Lead designing and optimizing scalable data pipelines for a property insurance company. Collaborating across teams, mentoring engineers, and driving data architecture initiatives.
Responsibilities
Build and optimize distributed data processing jobs using Apache Spark on Databricks.
Implement Delta Lake, Delta Live Tables (DLT) pipelines, dbt transformations, and the Medallion architecture for scalable, reliable data workflows.
Design and automate ETL pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
Integrate data from diverse sources including Duck Creek, Intacct, Workday, and external APIs.
Develop dimensional models (Star/Snowflake schemas), stored procedures, and views for data warehouses.
Ensure efficient querying and transformation using SQL, T-SQL, and PySpark.
Leverage Azure DevOps, CI/CD pipelines, and GitHub for version control and deployment.
Utilize Azure Logic Apps for workflow automation and MLflow for model training and lifecycle management.
Implement role-based access control (RBAC), data encryption, and auditing mechanisms.
Work closely with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
Mentor junior engineers and contribute to code reviews and architectural decisions.
Requirements
Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
5+ years of experience in data engineering with at least 2 years on Databricks.
Proficiency in Python, Scala, SQL, and Spark.
Hands-on experience with Azure Data Services (ADF, ADLS, Synapse).
Strong understanding of ETL, data warehousing, and data modeling concepts.
Experience with MicroStrategy and Power BI, including DAX and advanced visualizations.
Familiarity with MLflow, LangChain, and LLM integration is a plus.
Knowledge of Duck Creek is a plus.
Insurance domain knowledge preferred.
Preferred Certifications
Databricks Data Engineering Professional
Azure/AWS Data Engineering Certifications
Benefits
Opportunities to stretch and grow: your professional and personal development matters to us.
Clarity and kindness: you can rely on us to be open, honest and supportive, offering clarity on what success looks like.
Support in good times and bad: we believe in showing up for each other consistently, not only when it’s easy.
A community that cares: we are committed to sustaining a community in which each person feels cared for as an individual.
Senior Data Engineer at Keyrus leading the design, development, and delivery of scalable data platforms. Collaborating with teams to translate requirements into production-grade solutions and mentoring engineers.
Senior Data Engineer for global payments platform designing ETL pipelines and data models. Collaborating across teams to tackle complex data challenges in an innovative fintech environment.
Data Warehouse Modelling Engineer designing and maintaining data models using Data Vault 2.0 for the iGaming industry. Collaborating with stakeholders and optimizing data models in a hybrid work environment.
Senior Data Engineer driving impactful data solutions for the climate logistics startup HIVED's core data platform. Collaborating with cross-functional squads to enhance analytics and delivery.
Data Engineer developing and maintaining CRE forecasting infrastructure for Cushman & Wakefield. Collaborating with senior economists and technical teams to ensure high-quality data solutions.
Data Engineer at PwC working with Azure cloud services to enhance data handling and integrity. Responsibilities include pipeline optimization, documentation, and collaboration with stakeholders.
Data Engineer Manager at PwC focusing on building data infrastructure and solutions. Leading data engineering projects to transform raw data into actionable insights and drive business growth.