Data Engineer developing and maintaining data infrastructure for analytics at OAG. Collaborating with teams to ensure high data reliability and performance across platforms in the travel industry.
Responsibilities
Designing, building, and maintaining data pipelines using Azure and Airflow to support ingestion, transformation, and delivery of data from diverse sources
Developing, documenting, and optimizing dbt models and SQL transformations in Snowflake, ensuring data is well-structured, reusable, and performant for downstream analytics
Implementing and maintaining automated data quality checks, validation frameworks, and monitoring processes to ensure accuracy, completeness, and timeliness of data
Collaborating with product owners and analytics stakeholders to translate business requirements into scalable technical solutions
Contributing to Power BI data model development and supporting analysts in building performant, maintainable dashboards
Working closely with engineering peers to improve internal tooling, develop reusable components, and enforce best practices across the data stack
Supporting ongoing documentation, metadata management, and data governance efforts to increase transparency and reduce rework
Requirements
Experience in a Data Engineering role in a modern data stack environment, preferably within a data quality or analytics-focused team
Demonstrated ability to design and implement data pipelines, build reliable data models, and apply data quality techniques
Hands-on experience working in cloud-native environments (Azure preferred)
Comfortable working in Agile development teams and contributing to backlog refinement and sprint planning
Proven ability to partner across engineering and product teams to deliver business value from data
Benefits
Company-provided free lunch every day
Flexible working arrangements, with the option to work remotely or from the office
An attractive compensation and benefits package, including private health insurance and company bonus scheme
Voluntary participation in a company-supported retirement scheme
A generous annual leave policy that grows with each year of service, plus a day off during your birthday month
Work remotely from anywhere in the world for 1 month each year