Lead the design, development, and maintenance of scalable data pipelines using Apache Airflow, dbt, and Azure Data Factory.
Architect and optimise our Snowflake data warehouse, including schema design, performance tuning, and cost management.
Implement and enforce data governance policies, including data masking, role-based access control, and compliance with regulatory standards.
Drive MLOps initiatives: model deployment, monitoring, retraining pipelines, and integration with CI/CD workflows.
Collaborate with cross-functional teams to operationalise ML models, including Generative AI solutions, into production systems.
Partner with the Platforms team to oversee infrastructure-as-code deployments using Pulumi, Docker, Kubernetes and cloud-native services (Azure, AWS).
Lead, coach and guide a team of engineers, fostering a culture of best practices in coding, testing, and documentation alongside a culture of continuous learning and psychological safety.
Partner with stakeholders across the business, such as Operations, to translate business requirements into technical solutions.
Requirements
7+ years' experience in data engineering, with at least 2 years' experience leading a team, ideally in a fast-paced, high-growth environment.
Strong proficiency in Apache Airflow, dbt, and Azure Data Factory.
Expertise in Snowflake, including advanced SQL, performance optimisation, and security features.
Experience with real-time data processing (Apache Flink, Kafka).
Proven experience with MLOps tools and frameworks (MLflow, Kubeflow, or similar).
Hands-on experience deploying and integrating Generative AI models (OpenAI, Hugging Face, or similar).
Solid understanding of data governance frameworks and compliance requirements (GDPR, CCPA).
Proficiency in Python and SQL; familiarity with PySpark or other distributed processing frameworks is a plus.
Experience with cloud platforms (Azure, AWS) and infrastructure-as-code tools (Pulumi, Terraform).
Strong CI/CD skills with Azure Pipelines or equivalent.
Excellent communication skills, with the ability to interact effectively with technical and non-technical stakeholders.
Strong problem-solving skills, with a continuous learning mindset.
Benefits
Flexible and hybrid working
$500 every year to spend on your wellbeing
Take an extra Annual Leave day off on us every financial year, with A Day on Wisr
Access via WHEREFIT to discounted gym memberships, corporate discounts for wellbeing products and more!
Generous paid parental leave to support your transition to parenthood
Regular social events and awesome team offsites
Access to our Employee Assistance Program, Uprise, with up to 6 coaching sessions per year
Data Engineer developing architecture and pipelines for data analytics at NinjaTrader. Empowering analysts and improving business workflows through data-driven solutions.
Data Engineer joining Alterric to collaborate on data platform projects and analytics solutions. Working with Azure Cloud technologies to ensure data quality and integrity for informed decision-making.
Data Engineer at Kyndryl transforming raw data into actionable insights using ELK Stack. Responsible for developing, implementing, and maintaining data pipelines and processing workflows.
Senior Data Engineer at Clorox developing cloud-based data solutions. Leading data engineering projects and collaborating with business stakeholders to optimize data flows.
Data Engineer building solutions on AWS for high-performance data processing. Leading initiatives in data architecture and analytics for operational support.
Senior Data Engineer overseeing Databricks platform integrity, optimizing data practices for efficient usage. Leading teams on compliance while mentoring a junior Data Engineer.
Associate Data Engineer contributing to software applications development and maintenance using Python. Collaborating with teams for clean coding and debugging practices in Pune, India.
Lead Data Engineer responsible for delivering scalable cloud-based data solutions and managing cross-functional teams. Collaborating with global stakeholders and ensuring high-quality project execution in a fast-paced environment.
Data Engineer focusing on development and optimization of data pipelines in an insurance context. Ensuring data integrity and supporting data-driven decision-making processes.