Data Engineer at DyFlex Solutions designing and optimising enterprise-scale data solutions, engaging clients and leading development teams to unlock data value and accelerate performance.
Responsibilities
As a Data Engineer, you will design, build, and optimise enterprise-scale data solutions, helping our clients unlock the value of their data and accelerate performance.
Build and maintain scalable data pipelines for ingesting, transforming, and delivering data
Manage and optimise databases, warehouses, and cloud storage solutions
Implement data quality frameworks and testing processes to ensure reliable systems
Design and deliver cloud-based solutions (AWS, Azure, or GCP)
Take technical ownership of project components and lead small development teams
Engage directly with clients, translating business requirements into technical solutions
Champion best practices including version control, CI/CD, and infrastructure as code
Requirements
Hands-on data engineering experience in production environments
Strong proficiency in Python and SQL; experience with at least one additional language (e.g. Java, TypeScript/JavaScript)
Experience with modern frameworks such as Apache Spark, Airflow, dbt, Kafka, or Flink
Background in building ML pipelines, MLOps practices, or feature stores is highly valued
Proven expertise in relational databases, data modelling, and query optimisation
Demonstrated ability to solve complex technical problems independently
Excellent communication skills with ability to engage clients and stakeholders
Degree in Computer Science, Engineering, Data Science, Mathematics, or a related field
Benefits
Work with SAP’s latest cloud technologies, such as S/4HANA, BTP, and Joule, plus Databricks, ML/AI tools, and cloud platforms
A flexible and supportive work environment, including work-from-home options
Competitive remuneration and benefits, including a novated lease, birthday leave, remote working, additional purchased leave, and a company-provided laptop
Comprehensive training budget and paid certifications (Databricks, SAP, cloud platforms)
Structured career advancement pathways with mentoring from senior engineers
Exposure to diverse industries and client environments
Join a renowned organisation delivering projects to some of Australia’s leading enterprises