Own the Data Engineering architecture and roadmap. Build scalable ELT pipelines, maintain cloud data infrastructure, and ensure data quality, observability, and delivery to critical systems.
Responsibilities
Own the Data Engineering architecture and its roadmap
Design, build, and maintain reliable, scalable ELT pipelines
Model data (using dbt or equivalent) to meet analytical and operational needs
Centralize and expose data in the data warehouse for the entire company
Feed critical systems via reverse ETL pipelines (CRM, ERP, operational tools)
Ensure data quality, freshness, and observability
Maintain and evolve cloud data infrastructure (performance, cost, security)
Implement and uphold engineering best practices (CI/CD, testing, IaC, monitoring)
Be proactive in identifying technical debt and opportunities for improvement
Requirements
6+ years of experience in Data Engineering or data-focused Software Engineering, with strong proficiency in SQL and Python
Proven experience designing and operating scalable ELT pipelines
Strong expertise in data modeling (clear, modular models aligned with business needs)
Advanced experience with dbt and modern data warehouses (BigQuery, Snowflake, ClickHouse, etc.)
Significant experience working in cloud environments (AWS or GCP)
Comfortable in a solution-oriented environment where initiative is central, with a high level of autonomy and accountability
Ability to make technical decisions and ensure their successful implementation
Product sense, pragmatism, and a business-oriented mindset to deliver high-value solutions
Fluent English required; French is an asset
Benefits
A competitive compensation package, including stock options, to share in our success
A structured career path with performance reviews every 6 months to support your development
A modern, renovated, bright, and collaborative workspace in the heart of Mile-End
Flexible hybrid work policy, with the possibility to work up to one month per year from abroad
Contribution toward your STM monthly transit pass
4 weeks of vacation and 5 personal days
Additional days off for important life events (moving, birth, marriage, etc.)
Competitive health coverage for you and your family to support your well-being