Data Engineer at DyFlex Solutions designing and optimizing enterprise-scale data solutions. Engaging clients and leading development teams to unlock data value and performance.
Responsibilities
As a Data Engineer, you will design, build, and optimise enterprise-scale data solutions, helping our clients unlock the value of their data and accelerate performance.
Build and maintain scalable data pipelines for ingesting, transforming, and delivering data
Manage and optimise databases, warehouses, and cloud storage solutions
Implement data quality frameworks and testing processes to ensure reliable systems
Design and deliver cloud-based solutions (AWS, Azure, or GCP)
Take technical ownership of project components and lead small development teams
Engage directly with clients, translating business requirements into technical solutions
Champion best practices including version control, CI/CD, and infrastructure as code
Requirements
Hands-on data engineering experience in production environments
Strong proficiency in Python and SQL; experience with at least one additional language (e.g. Java, TypeScript/JavaScript)
Experience with modern frameworks such as Apache Spark, Airflow, dbt, Kafka, or Flink
Background in building ML pipelines, MLOps practices, or feature stores is highly valued
Proven expertise in relational databases, data modelling, and query optimisation
Demonstrated ability to solve complex technical problems independently
Excellent communication skills with ability to engage clients and stakeholders
Degree in Computer Science, Engineering, Data Science, Mathematics, or a related field
Benefits
Work with SAP’s latest cloud technologies such as S/4HANA, BTP, and Joule, plus Databricks, ML/AI tools, and cloud platforms
A flexible and supportive work environment including work from home
Competitive remuneration and benefits including novated lease, birthday leave, remote working, additional purchased leave, and company-provided laptop
Comprehensive training budget and paid certifications (Databricks, SAP, cloud platforms)
Structured career advancement pathways with mentoring from senior engineers
Exposure to diverse industries and client environments
Join a renowned organisation delivering projects to some of Australia’s leading enterprises
Senior Data Engineer supporting an AI-enabled financial compliance initiative with data pipelines and ingestion processes. Collaborating with diverse teams in a mission-critical regulated environment.
Data Architect leading the definition and construction of cloud data architecture for Kyndryl. Participating in significant technological modernization initiatives, focusing on Google Cloud Platform.
Senior Data Engineer driving data intelligence requirements and scalable data solutions for a global consulting firm. Collaborating across functions to enhance Microsoft architecture and analytics capabilities.
Experienced AI Engineer designing and building production-grade agentic AI systems using generative AI and large language models. Collaborating with data engineers and data scientists in a tech-driven company.
Intermediate Data Engineer designing and building data pipelines for travel industry data management. Collaborating across teams to ensure reliable data for analytics and reporting.
Data Engineer managing and organizing datasets for AI models at Walaris, developing AI-driven autonomous systems for defense and security applications.
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.