Data Warehouse Modelling Engineer designing and maintaining data models using Data Vault 2.0 for the iGaming industry. Collaborating with stakeholders and optimizing data models in a hybrid work environment.
Responsibilities
Design and implement Data Vault 2.0 models (Raw Vault and Business Vault) including Hubs, Links, Satellites, Point-in-Time (PIT) tables, and Bridges.
Develop and maintain robust dbt projects for Data Vault implementation, leveraging macros, packages (e.g., datavault4dbt, AutomateDV/dbtvault), YAML configurations, and Jinja templating.
Build staging layers, raw vault loading patterns, and business vault transformations following Data Vault 2.0 standards and best practices.
Ensure high data quality, auditability, traceability, and historical tracking through proper hash key design, loading patterns, and effective dating.
Collaborate with data architects, data engineers, and business stakeholders to translate business requirements into scalable Data Vault models.
Optimize data models for performance.
Implement dbt best practices: tests (unit + schema), documentation, selectors, CI/CD pipelines, and version control (Git).
Support incremental loading strategies, handling of late-arriving data, and multi-source integration challenges.
Contribute to the evolution of the data modelling framework and automation standards.
Requirements
Strong hands-on experience (5+ years) in Data Vault 2.0 modelling on large-scale enterprise projects.
Extensive expertise in building and maintaining Data Vault implementations using dbt (dbt Core or dbt Cloud).
Deep understanding of Data Vault 2.0 components: Hubs, Links, Satellites (effectivity, multi-active, computed), PIT tables, Bridges, and Reference tables.
Solid SQL skills and experience with modern data warehouses (Trino preferred, ClickHouse a plus).
Advanced proficiency in dbt — models, macros, tests, documentation, exposures, and project structuring.
Experience with Data Vault automation packages (datavault4dbt, AutomateDV, or similar).
Strong knowledge of ELT/ETL patterns, data integration from multiple sources, and handling of changing business keys.
Proficiency with Git, CI/CD for data (e.g., dbt Cloud jobs, GitHub Actions), and Infrastructure as Code.
Familiarity with dimensional modelling (Kimball Star Schema) as a downstream consumption layer is a plus.
Excellent analytical and problem-solving skills with a strong focus on scalability and maintainability.
Ability to work in agile environments and deliver incrementally.
Strong communication skills — able to explain complex modelling concepts to both technical and non-technical stakeholders.
Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field (or equivalent experience).
Benefits
Health insurance for employees and close family members
Career growth opportunity
Training and professional development events
Teamwork and accountability
Sense of community and defined company culture
Gym reimbursement after successfully completing the probationary period
Job title
Data Warehouse Modelling Engineer, Data Vault 2.0, dbt