**Architect and Evolve Our Core Data Platform** You will own the technical vision and roadmap for our data platform, steering its evolution on our modern cloud stack and ensuring it meets the demands of a rapidly scaling business.
**Own the Architecture:** Design, implement, and refine a robust data lakehouse architecture (e.g., Medallion) using Databricks and Delta Lake to ensure data reliability and performance.
**Build Scalable Ingestion Frameworks:** Develop and maintain resilient, reusable patterns for ingesting data from a diverse set of sources, including our systems, transactional databases, event streams, and third-party SaaS APIs.
**Define Data Modelling Standards:** Lead the implementation of our core data modelling principles (e.g., Kimball dimensional modelling) to produce curated, intuitive datasets for business intelligence and product analytics.
**Implement Robust Governance:** Use tools like Unity Catalog to establish a comprehensive data governance framework, covering data lineage, fine-grained access controls, and a user-friendly data catalogue.
**Manage Platform Performance and Cost:** Develop and implement strategies for monitoring, optimising, and forecasting our Databricks and cloud expenditure, ensuring the platform is both powerful and cost-effective.
**Champion Engineering Excellence and Best Practice** You will be the driving force for maturing our data operations, embedding a culture of quality, automation, and reliability into everything we do.
**Automate Everything with CI/CD:** Implement and advocate for automated CI/CD pipelines (e.g., using GitHub Actions) for all data assets, including dbt models, infrastructure changes, and Databricks jobs.
**Embed Git-Based Workflows:** Champion a Git-first culture for all data transformation code, establishing clear processes for branching, code reviews, and version control.
**Embed Automated Data Quality:** Implement comprehensive, automated data quality testing at every stage of our pipelines using tools like dbt tests, ensuring data is accurate and trustworthy.
**Introduce Data Observability:** Establish thorough monitoring, logging, and alerting for all data pipelines to proactively detect, diagnose, and resolve issues before they impact the business.
**Requirements**
Mastery of data architecture principles, data modelling frameworks (e.g., dimensional modelling), and a strong understanding of data governance and security best practices.
A strong software engineering mindset, with significant experience implementing CI/CD for data, Git-based workflows, and automated data quality testing.
Exceptional communication and stakeholder management skills, with a proven ability to translate complex technical concepts for non-technical audiences and influence business decisions.
A genuine passion for leadership and mentorship, with a track record of elevating the technical skills of those around you.
**Tech Stack:**
dbt
Databricks, Unity Catalog
Terraform
AWS: Redshift, DynamoDB, API Gateway, CloudWatch, Lambda, streaming with Kinesis/Firehose, Glue, Bedrock
Stitch & Fivetran
Languages required: advanced SQL and Python
**Benefits**
Enjoy a flexible remote-first work policy (with a work-from-home stipend to set you up for success!)
Own a piece of Deputy via our Employee Share Ownership Plan (ESOP)
Take paid parental leave to support you and your family
Stay protected with Group Salary Continuance Insurance
Access support through our Employee Assistance Program
Enjoy additional leave days — including study assistance, celebration days and volunteering
Join our global working groups focused on collaboration, belonging and connection
Get creative at our annual Hackathons
Take advantage of our novated leasing for electric vehicles, internet reimbursement and more!