Lead Data Engineer architecting and building data platforms at an ASX-listed fintech. Spearheading MLOps initiatives and collaborating across teams in a hybrid work environment.
Responsibilities
Lead the design, development, and maintenance of scalable data pipelines using Apache Airflow, dbt, and Azure Data Factory.
Architect and optimise our Snowflake data warehouse, including schema design, performance tuning, and cost management.
Implement and enforce data governance policies, including data masking, role-based access control, and compliance with regulatory standards.
Drive MLOps initiatives: model deployment, monitoring, retraining pipelines, and integration with CI/CD workflows.
Collaborate with cross-functional teams to operationalise ML models, including Generative AI solutions, into production systems.
Partner with the Platforms team to oversee infrastructure-as-code deployments using Pulumi, Docker, Kubernetes and cloud-native services (Azure, AWS).
Lead, coach, and mentor a team of engineers, fostering a culture of best practices in coding, testing, and documentation, alongside a culture of continuous learning and psychological safety.
Partner with stakeholders across the business, such as Operations, to translate business requirements into technical solutions.
Requirements
7+ years' experience in data engineering, with at least 2 years leading a team, ideally in a fast-paced, high-growth environment.
Strong proficiency in Apache Airflow, dbt, and Azure Data Factory.
Expertise in Snowflake, including advanced SQL, performance optimisation, and security features.
Experience with real-time data processing frameworks (Apache Flink, Kafka).
Proven experience with MLOps tools and frameworks (MLflow, Kubeflow, or similar).
Hands-on experience deploying and integrating Generative AI models (OpenAI, Hugging Face, or similar).
Solid understanding of data governance frameworks and compliance requirements (GDPR, CCPA).
Proficiency in Python and SQL; familiarity with PySpark or other distributed processing frameworks is a plus.
Experience with cloud platforms (Azure, AWS) and infrastructure-as-code tools (Pulumi, Terraform).
Strong CI/CD skills with Azure Pipelines or equivalent.
Excellent communication skills, with the ability to interact effectively with technical and non-technical stakeholders.
Strong problem-solving skills, with a continuous learning mindset.
Benefits
Flexible and hybrid working
$500 every year to spend on your wellbeing
Take an extra Annual Leave day off on us every financial year, with A Day on Wisr
Access via WHEREFIT to discounted gym memberships, corporate discounts for wellbeing products and more!
Generous paid parental leave to support your transition to parenthood
Regular social events and awesome team offsites
Access to our Employee Assistance Program, Uprise, with up to 6 coaching sessions per year
Senior Data Engineer supporting an AI-enabled financial compliance initiative with data pipelines and ingestion processes. Collaborating with diverse teams in a mission-critical regulated environment.
Data Architect leading the definition and construction of cloud data architecture for Kyndryl. Participating in significant technological modernization initiatives, focusing on Google Cloud Platform.
Senior Data Engineer driving data intelligence requirements and scalable data solutions for a global consulting firm. Collaborating across functions to enhance Microsoft architecture and analytics capabilities.
Experienced AI Engineer designing and building production-grade agentic AI systems using generative AI and large language models. Collaborating with data engineers and data scientists in a tech-driven company.
Intermediate Data Engineer designing and building data pipelines for travel industry data management. Collaborating across teams to ensure reliable data for analytics and reporting.
Data Engineer managing and organizing datasets for AI models at Walaris, developing AI-driven autonomous systems for defense and security applications.
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.