**Architect and Evolve Our Core Data Platform** You will own the technical vision and roadmap for our data platform, steering its evolution on our modern cloud stack and ensuring it meets the demands of a rapidly scaling business.
**Own the Architecture:** Design, implement, and refine a robust data lakehouse architecture (e.g., Medallion) using Databricks and Delta Lake to ensure data reliability and performance.
**Build Scalable Ingestion Frameworks:** Develop and maintain resilient, reusable patterns for ingesting data from a diverse set of sources, including internal systems, transactional databases, event streams, and third-party SaaS APIs.
**Define Data Modelling Standards:** Lead the implementation of our core data modelling principles (e.g., Kimball dimensional modelling) to produce curated, intuitive datasets for business intelligence and product analytics.
**Implement Robust Governance:** Use tools like Unity Catalog to establish a comprehensive data governance framework, covering data lineage, fine-grained access controls, and a user-friendly data catalogue.
**Manage Platform Performance and Cost:** Develop and implement strategies for monitoring, optimising, and forecasting our Databricks and cloud expenditure, ensuring the platform is both powerful and cost-effective.
**Champion Engineering Excellence and Best Practice** You will be the driving force for maturing our data operations, embedding a culture of quality, automation, and reliability into everything we do.
**Automate Everything with CI/CD:** Implement and advocate for automated CI/CD pipelines (e.g., using GitHub Actions) for all data assets, including dbt models, infrastructure changes, and Databricks jobs.
**Embed Git-Based Workflows:** Champion a Git-first culture for all data transformation code, establishing clear processes for branching, code reviews, and version control.
**Embed Automated Data Quality:** Implement comprehensive, automated data quality testing at every stage of our pipelines using tools like dbt tests, ensuring data is accurate and trustworthy.
**Introduce Data Observability:** Establish thorough monitoring, logging, and alerting for all data pipelines to proactively detect, diagnose, and resolve issues before they impact the business.
**Requirements**
Mastery of data architecture principles, data modelling frameworks (e.g., dimensional modelling), and a strong understanding of data governance and security best practices.
A strong software engineering mindset, with significant experience implementing CI/CD for data, Git-based workflows, and automated data quality testing.
Exceptional communication and stakeholder management skills, with a proven ability to translate complex technical concepts for non-technical audiences and influence business decisions.
A genuine passion for leadership and mentorship, with a track record of elevating the technical skills of those around you.
**Tech Stack:**
dbt
Databricks, Unity Catalog
Terraform
AWS: Redshift, DynamoDB, API Gateway, CloudWatch, Lambda, streaming with Kinesis/Firehose, Glue, Bedrock
Stitch & Fivetran
Languages: advanced SQL and Python
**Benefits**
Enjoy a flexible remote-first work policy (with a work-from-home stipend to set you up for success!)
Own a piece of Deputy via our Employee Share Ownership Plan (ESOP)
Take paid parental leave to support you and your family
Stay protected with Group Salary Continuance Insurance
Access support through our Employee Assistance Program
Enjoy additional leave days — including study assistance, celebration days and volunteering
Join our global working groups focused on collaboration, belonging and connection
Get creative at our annual Hackathons
Take advantage of our novated leasing for electric vehicles, internet reimbursement and more!