Senior Data Architect focused on designing standardized, scalable data models for the insurance sector. Working on data modernization initiatives across Property & Casualty (P&C) and Life & Annuity (L&A) business units.
Responsibilities
Design and maintain enterprise data models using Data Vault 2.0, dimensional modeling (Kimball), and normalized approaches (Inmon); an illustrative Data Vault sketch follows this list.
Develop modular, reusable model components that can be adapted across multiple business entities.
Build curated datasets and support the development of data products for business consumption.
Work with engineering teams to implement models on Azure and Databricks platforms.
Align data model designs with ETL/ELT pipelines, ingestion frameworks, and BI/reporting tools.
Ensure models support performance, scalability, and data quality requirements.
Establish and maintain modeling standards, documentation guidelines, metadata structures, and lineage requirements.
Promote consistency and adherence to data governance practices across teams.
Work closely with business stakeholders to translate requirements into data model designs.
Provide guidance and mentorship to junior data modelers.
Serve as a subject matter expert for data modeling methodologies and modern data architecture practices.
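For context on the modeling styles named above, the sketch below shows a minimal Data Vault 2.0 slice for a hypothetical P&C policy/claim relationship: a hub per business key, a link for the relationship, and a satellite for change-tracked attributes. All table and column names are illustrative assumptions, not part of this posting, and the DDL is generic ANSI-style SQL rather than any specific platform dialect.

```sql
-- Minimal Data Vault 2.0 sketch for a hypothetical policy/claim slice.
-- Table and column names are illustrative, not taken from this posting.

-- Hub: one row per unique business key (policy number).
CREATE TABLE hub_policy (
    policy_hk       CHAR(32)    NOT NULL,   -- hash of the business key
    policy_number   VARCHAR(30) NOT NULL,   -- business key
    load_dts        TIMESTAMP   NOT NULL,
    record_source   VARCHAR(50) NOT NULL,
    CONSTRAINT pk_hub_policy PRIMARY KEY (policy_hk)
);

-- Hub: one row per unique claim number.
CREATE TABLE hub_claim (
    claim_hk        CHAR(32)    NOT NULL,
    claim_number    VARCHAR(30) NOT NULL,
    load_dts        TIMESTAMP   NOT NULL,
    record_source   VARCHAR(50) NOT NULL,
    CONSTRAINT pk_hub_claim PRIMARY KEY (claim_hk)
);

-- Link: the relationship between a policy and a claim.
CREATE TABLE link_policy_claim (
    policy_claim_hk CHAR(32)    NOT NULL,   -- hash of the combined keys
    policy_hk       CHAR(32)    NOT NULL,
    claim_hk        CHAR(32)    NOT NULL,
    load_dts        TIMESTAMP   NOT NULL,
    record_source   VARCHAR(50) NOT NULL,
    CONSTRAINT pk_link_policy_claim PRIMARY KEY (policy_claim_hk)
);

-- Satellite: descriptive, change-tracked attributes of the policy.
CREATE TABLE sat_policy_detail (
    policy_hk       CHAR(32)      NOT NULL,
    load_dts        TIMESTAMP     NOT NULL,
    hash_diff       CHAR(32)      NOT NULL,  -- detects attribute changes
    product_code    VARCHAR(10),
    effective_date  DATE,
    expiration_date DATE,
    annual_premium  DECIMAL(12,2),
    record_source   VARCHAR(50)   NOT NULL,
    CONSTRAINT pk_sat_policy_detail PRIMARY KEY (policy_hk, load_dts)
);
```

Reusable components in this style typically come from standardizing the hub/link/satellite pattern itself (hash keys, load timestamps, record source) so the same template can be stamped out for other business entities such as billing or underwriting.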
Requirements
10–12 years of experience in data architecture, data modeling, or data warehousing.
Strong expertise in Data Vault 2.0, including hub, link, and satellite design.
Hands-on experience with Azure (ADF, Synapse, ADLS) and Databricks.
Deep understanding of Kimball and Inmon modeling techniques; a Kimball-style star schema sketch follows this list.
Proficiency in SQL and ELT/ETL concepts.
Experience with metadata management and modeling documentation.
Insurance industry experience in P&C and/or L&A (policy, claims, underwriting, billing, actuarial, etc.).
Strong communication and collaboration skills; able to guide junior team members.
Experience designing reusable data model templates or frameworks.
Background supporting data product development or semantic layer implementation.
Familiarity with data governance frameworks and enterprise data management practices.
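To complement the Data Vault sketch above, the example below is a minimal Kimball-style star schema for claims reporting: a policy dimension with type-2 change tracking, a date dimension, and a claim transaction fact. All names are illustrative assumptions, not taken from this posting.

```sql
-- Minimal Kimball-style star schema sketch for claims reporting.
-- Names are illustrative, not taken from this posting.

CREATE TABLE dim_policy (
    policy_key        INT         NOT NULL,  -- surrogate key
    policy_number     VARCHAR(30) NOT NULL,  -- natural key
    product_code      VARCHAR(10),
    line_of_business  VARCHAR(20),           -- e.g. 'P&C' or 'L&A'
    effective_date    DATE,
    row_effective_dts TIMESTAMP   NOT NULL,  -- type-2 change tracking
    row_expiry_dts    TIMESTAMP,
    is_current        CHAR(1)     NOT NULL,
    CONSTRAINT pk_dim_policy PRIMARY KEY (policy_key)
);

CREATE TABLE dim_date (
    date_key        INT  NOT NULL,           -- e.g. 20240131
    calendar_date   DATE NOT NULL,
    calendar_year   INT  NOT NULL,
    calendar_month  INT  NOT NULL,
    CONSTRAINT pk_dim_date PRIMARY KEY (date_key)
);

-- Fact table grained at one row per claim transaction.
CREATE TABLE fact_claim_transaction (
    claim_number         VARCHAR(30)   NOT NULL,  -- degenerate dimension
    policy_key           INT           NOT NULL REFERENCES dim_policy (policy_key),
    transaction_date_key INT           NOT NULL REFERENCES dim_date (date_key),
    paid_amount          DECIMAL(12,2) NOT NULL,
    reserve_amount       DECIMAL(12,2) NOT NULL
);

-- Example reporting query: paid claims by year and line of business.
SELECT d.calendar_year, p.line_of_business, SUM(f.paid_amount) AS total_paid
FROM fact_claim_transaction f
JOIN dim_date d   ON f.transaction_date_key = d.date_key
JOIN dim_policy p ON f.policy_key = p.policy_key
GROUP BY d.calendar_year, p.line_of_business;
```

A star schema like this typically serves the BI/reporting and semantic layers, while an integration layer such as the Data Vault above handles history and source consolidation.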
Benefits
EXL never requires or asks for fees/payments or credit card or bank details during any phase of the recruitment or hiring process and has not authorized any agencies or partners to collect any fee or payment from prospective candidates. EXL will only extend a job offer after a candidate has gone through a formal interview process with members of EXL’s Human Resources team, as well as our hiring managers.