Lead Data Engineer architecting and building data platforms at an ASX-listed fintech. Spearheading MLOps initiatives and collaborating across teams in a hybrid work environment.
Responsibilities
Lead the design, development, and maintenance of scalable data pipelines using Apache Airflow, dbt, and Azure Data Factory.
Architect and optimise our Snowflake data warehouse, including schema design, performance tuning, and cost management.
Implement and enforce data governance policies, including data masking, role-based access control, and compliance with regulatory standards.
Drive MLOps initiatives: model deployment, monitoring, retraining pipelines, and integration with CI/CD workflows.
Collaborate with cross-functional teams to operationalise ML models, including Generative AI solutions, into production systems.
Partner with the Platforms team to oversee infrastructure-as-code deployments using Pulumi, Docker, Kubernetes, and cloud-native services (Azure, AWS).
Lead, coach, and guide a team of engineers, fostering best practices in coding, testing, and documentation, alongside a culture of continuous learning and psychological safety.
Partner with stakeholders across the business, such as Operations, to translate business requirements into technical solutions.
Requirements
7+ years' experience in data engineering, including at least 2 years leading a team, ideally in a fast-paced, high-growth environment.
Strong proficiency in Apache Airflow, dbt, and Azure Data Factory.
Expertise in Snowflake, including advanced SQL, performance optimisation, and security features.
Experience with real-time data processing (Apache Flink, Kafka).
Proven experience with MLOps tools and frameworks (MLflow, Kubeflow, or similar).
Hands-on experience deploying and integrating Generative AI models (OpenAI, Hugging Face, or similar).
Solid understanding of data governance frameworks and compliance requirements (GDPR, CCPA).
Proficiency in Python and SQL; familiarity with PySpark or other distributed processing frameworks is a plus.
Experience with cloud platforms (Azure, AWS) and infrastructure-as-code tools (Pulumi, Terraform).
Strong CI/CD skills with Azure Pipelines or equivalent.
Excellent communication skills, with the ability to interact effectively with technical and non-technical stakeholders.
Strong problem-solving skills, with a continuous learning mindset.
Benefits
Flexible and hybrid working
$500 every year to spend on your wellbeing
Take an extra Annual Leave day off on us every financial year, with A Day on Wisr
Access via WHEREFIT to discounted gym memberships, corporate discounts for wellbeing products and more!
Generous paid parental leave to support your transition to parenthood
Regular social events and awesome team offsites
Access to our Employee Assistance Program, Uprise, with up to 6 coaching sessions per year