Data Architect leading the design of a Customer Data Mart at ShyftLabs, a company building AI solutions for Fortune 500 clients. Collaborating with teams to implement scalable, secure, and modern data architectures.
Responsibilities
Own the technical vision and architecture for the Unified Customer Data Mart, ensuring solutions are scalable, secure, compliant, and aligned with enterprise standards.
Design and implement end-to-end data pipeline architectures, including raw data ingestion (Bronze), data cleaning and standardization (Silver), and curated data marts (Gold) that serve CDP, reporting, and activation use cases.
Define and evolve data modeling standards for customer data, including customer dimensions, transaction facts, engagement events, web behavior, support interactions, and loyalty activity.
Decompose complex business requirements into structured technical solutions and drive alignment with client stakeholders.
Formulate, compare, and present multiple architectural approaches for data ingestion, transformation, identity resolution, and consumption patterns, guiding clients and internal teams toward optimal long-term solutions that balance speed, maintainability, and scalability.
Architect and build production-grade data pipelines using dbt and Airflow that support customer analytics, segmentation, and reporting at scale.
Partner directly with client stakeholders to understand business objectives, translate customer journey requirements into robust technical designs, and act as a trusted technical advisor on data architecture decisions.
Lead and mentor cross-functional teams, including Analytics Engineers, Data Engineers, and BI developers, setting a high bar for technical quality, code review standards, and documentation practices.
Influence and contribute to data governance initiatives, including PII handling, data quality frameworks, identity resolution strategies, and platform reliability.
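The Bronze/Silver/Gold layering described above can be sketched in plain Python. This is a minimal illustration only; in practice these layers would be dbt models orchestrated by Airflow, and all field names and records here are hypothetical:

```python
# Minimal sketch of a medallion-style flow: raw ingestion (Bronze),
# cleaning/standardization (Silver), and a curated mart (Gold).
# Field names and sample records are illustrative assumptions.

def ingest_bronze(raw_rows):
    """Bronze: land raw records as-is, tagging each with its source."""
    return [dict(row, _source="crm_export") for row in raw_rows]

def refine_silver(bronze_rows):
    """Silver: standardize types and drop rows missing a customer key."""
    silver = []
    for row in bronze_rows:
        if not row.get("customer_id"):
            continue  # reject records that cannot be keyed
        silver.append({
            "customer_id": str(row["customer_id"]).strip(),
            "email": (row.get("email") or "").strip().lower(),
            "amount": float(row.get("amount", 0)),
        })
    return silver

def build_gold(silver_rows):
    """Gold: aggregate Silver rows into a per-customer spend mart."""
    mart = {}
    for row in silver_rows:
        cid = row["customer_id"]
        mart[cid] = mart.get(cid, 0.0) + row["amount"]
    return mart

raw = [
    {"customer_id": " 42 ", "email": "A@Example.com", "amount": "19.50"},
    {"customer_id": "42", "email": "a@example.com", "amount": "5.50"},
    {"customer_id": None, "email": "orphan@example.com", "amount": "7.00"},
]
gold = build_gold(refine_silver(ingest_bronze(raw)))
print(gold)  # {'42': 25.0}
```

The same separation of concerns applies at scale: each layer has one responsibility, so data quality rules live in Silver and business logic lives in Gold.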
Requirements
Deep expertise in SQL and Python, with demonstrated ability to design, optimize, and troubleshoot complex distributed data systems.
5+ years of experience in data engineering and/or data architecture, with a proven track record of building and scaling enterprise-level data platforms.
Extensive experience designing and implementing data lakes, cloud data warehouses, and modern analytics architectures in production environments.
Hands-on experience with dbt for transformations and modular data modeling.
Hands-on experience with Google BigQuery (mandatory) or equivalent cloud warehouses (Snowflake, Databricks).
Hands-on experience with Airflow (or similar orchestration frameworks).
Proven experience implementing medallion or layered data architectures, including raw ingestion, conformed layers, and curated marts.
Strong foundation in dimensional modeling, star/snowflake schemas, conformed dimensions, and designing for both analytical and operational use cases.
Experience with Customer Data Platforms (CDPs) and multi-channel customer data integration, including identity resolution (deterministic and probabilistic matching).
Experience designing for security and compliance, including PII masking, access controls, RLS policies, encryption, and privacy regulations (GDPR/CCPA).
Strong understanding of cloud architecture principles, including storage optimization, cost management, security patterns, and scalability in GCP environments.
Demonstrated ability to operate independently with full architectural ownership while influencing senior stakeholders in client-facing environments.
Experience leading and mentoring engineers, setting architectural standards, and driving technical governance.
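The deterministic identity resolution and PII masking called out above can be sketched roughly as follows. This is a toy illustration that assumes email is the sole match key; real platforms combine multiple keys and probabilistic scoring:

```python
import hashlib

# Toy sketch: deterministic identity resolution by exact email match,
# plus simple PII masking for downstream layers. The choice of match
# key (email) and the masking scheme are illustrative assumptions.

def resolve_identities(records):
    """Group records sharing a normalized email under one unified ID."""
    clusters = {}
    for rec in records:
        key = rec["email"].strip().lower()
        # Stable, non-reversible cluster ID derived from the match key.
        unified_id = hashlib.sha256(key.encode()).hexdigest()[:12]
        clusters.setdefault(unified_id, []).append(rec)
    return clusters

def mask_email(email):
    """Mask the local part of an email, keeping its first character."""
    local, _, domain = email.partition("@")
    return f"{local[:1]}***@{domain}"

records = [
    {"email": "Jane.Doe@example.com", "channel": "web"},
    {"email": "jane.doe@example.com", "channel": "support"},
    {"email": "other@example.com", "channel": "loyalty"},
]
clusters = resolve_identities(records)
print(len(clusters))                       # 2 unified customers
print(mask_email("jane.doe@example.com"))  # j***@example.com
```

Hashing the normalized key gives a stable surrogate identifier, so the raw email never needs to appear in curated layers, which pairs naturally with GDPR/CCPA minimization requirements.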
Benefits
Comprehensive Benefits: 100% coverage for health, dental, and vision insurance for you and your dependents from day one.
Hybrid Flexibility: 4 days per week in our downtown Toronto office.
Continuous learning opportunities and influence over technical direction.
Shape applied research and AI strategy in a fast-growing, product-focused data company.