Data Engineer developing scalable data solutions across multi-cloud environments for clients. Mentoring junior engineers while ensuring data quality and promoting best practices within the team.
Responsibilities
Design, build, and optimise scalable Databricks Lakehouse solutions across AWS, Azure, and GCP
Develop robust data ingestion, transformation, and orchestration pipelines using Databricks (Spark, Delta Lake, Workflows)
Build high-quality data models to support analytics, reporting, and AI/ML use cases
Implement medallion architectures (bronze, silver, gold) and modern data engineering patterns
Collaborate closely with clients to translate business requirements into well-architected, actionable data solutions
Support or implement dbt, CI/CD pipelines, Git-based workflows, and engineering best practices
Ensure strong data quality, governance, lineage, security, and performance optimisation within Databricks environments
Work alongside analytics, governance, and AI consultants to deliver cohesive, end-to-end solutions
Contribute to reusable assets, accelerators, and internal frameworks that strengthen Intelligen’s Databricks capability
Mentor junior engineers and positively influence client delivery and engineering standards
Requirements
4–6+ years’ experience in data engineering or analytics engineering
Strong hands-on experience with Databricks (Spark, Delta Lake, Workflows), ideally in production environments
Experience with at least one major cloud platform: AWS, Azure, or GCP
Strong SQL skills and experience building complex data transformations
Familiarity with modern data stacks — e.g. Databricks, Snowflake, dbt, cloud data lakes, orchestration tools
Experience working across the full data lifecycle: ingestion → transformation → modelling → consumption
Consulting, stakeholder-facing experience, or cross-functional delivery exposure
Knowledge of DevOps concepts, version control, and/or CI/CD in data environments
Excellent communication, problem-solving, and collaboration skills
Sydney-based, with ability to work on-site with clients as required
A mindset of curiosity, delivery excellence, and continuous learning
Benefits
Work From Home - Flexible hours
Training & Development
Free Food & Snacks
Many socials and community groups
Opportunity to drive projects that are of interest to you!