Senior Data Engineer responsible for building and optimizing ETL pipelines for a leading travel data platform. Collaborating with software and data engineers to modernize data ingestion and processing.
Responsibilities
Design, build, and optimize scalable ETL and Structured Streaming pipelines in Azure Databricks for real-time and batch ingestion of Flight Status data
Design and implement data ingestion and processing pipelines that consolidate heterogeneous data sources, including APIs, event streams, and file-based feeds, into the OAG lakehouse (Azure Databricks + Delta Lake), ensuring data consistency, reliability, and scalability
Implement and monitor data quality using automated validation, alerting, and observability practices
Develop and maintain orchestration workflows in Apache Airflow, coordinating ingestion and transformation processes across multiple data flows
Build reusable frameworks for schema evolution, error handling, deduplication, and auditing
Collaborate with data platform, analytics, and product teams to define SLAs, data contracts, and performance targets
Optimize Spark and Delta Lake performance for scalability, latency, and cost efficiency
Implement CI/CD pipelines and automation for data workflows using Azure DevOps or equivalent tools
Mentor engineers, review code, contribute to platform design discussions and planning, and help grow data engineering competencies in the team and across OAG
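To illustrate the automated data-quality validation mentioned above, here is a minimal sketch in plain Python; the field names, allowed statuses, and the `validate_record` helper are hypothetical examples, not taken from any OAG schema:

```python
from datetime import datetime

# Hypothetical flight-status record checks; field names and allowed
# statuses are illustrative, not drawn from any real OAG schema.
REQUIRED_FIELDS = {"flight_id", "status", "updated_at"}
VALID_STATUSES = {"scheduled", "departed", "landed", "cancelled"}

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality errors for one record (empty = valid)."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if record.get("status") not in VALID_STATUSES:
        errors.append(f"unknown status: {record.get('status')!r}")
    ts = record.get("updated_at")
    if ts is not None:
        try:
            datetime.fromisoformat(ts)
        except (TypeError, ValueError):
            errors.append(f"bad timestamp: {ts!r}")
    return errors
```

In a production pipeline, checks like these would typically run inside the Spark job or an expectations framework, with failures routed to alerting rather than returned as a list.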
Requirements
Proven track record in data engineering with a strong focus on ETL development and streaming data architectures
Experience with Azure Databricks, Apache Spark (Structured Streaming), and Delta Lake
Proficiency in Python (PySpark) and SQL, with experience transforming large-scale, complex datasets
Hands-on experience in data orchestration and workflow automation (e.g., Apache Airflow or similar)
Experience working in a cloud data environment (preferably Azure) across storage, compute, and pipeline services
Familiarity with streaming or messaging technologies (e.g., Kafka, Event Hubs)
Strong understanding of data quality, validation, and observability practices
Ability to deliver production-grade solutions with a results-oriented and ownership-driven mindset
Experience implementing CI/CD and version-control practices using Azure DevOps, GitHub Actions, or similar tools
Excellent analytical, communication, and collaboration skills
Strong understanding of modern data engineering patterns and ability to design scalable, modular, and reliable data systems
Benefits
Company-provided free lunch every day
Private health insurance
Company bonus scheme
Voluntary participation in a company-supported retirement scheme
Generous annual leave policy that grows with each year of service