Intermediate Data Engineer designing and building data pipelines for travel industry data management. Collaborating across teams to ensure reliable data for analytics and reporting.
Responsibilities
Design, develop, and maintain robust ETL/ELT pipelines to integrate data from multiple sources into a centralized cloud-based data platform
Build scalable data ingestion, transformation, and enrichment processes using Python, SQL, and PySpark
Optimize data workflows for performance, scalability, and cost efficiency in the cloud
Implement data quality and validation checks to ensure trust in reporting, analytics, and data-driven products
Collaborate with cross-functional teams to translate business requirements into technical data solutions
Support large-scale transformations using distributed processing frameworks
Troubleshoot and resolve issues in data pipelines, ensuring reliability and uptime
Participate in code reviews and contribute to engineering standards and best practices
Document data processes, pipelines, and schemas to improve transparency and reusability
Stay current with modern data engineering tools, practices, and cloud technologies, with a passion for continual learning and knowledge sharing
Build with stakeholders in mind, delivering usable data products rather than raw pipelines alone
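To illustrate the data quality and validation checks mentioned above, here is a minimal sketch in plain Python. The record fields and thresholds are hypothetical (travel bookings chosen only to match the domain); a real pipeline would typically express checks like these in a framework such as Great Expectations or as PySpark filters.

```python
# Minimal data-quality gate: split a batch into clean and rejected rows.
# Field names ("booking_id", "origin", etc.) are illustrative assumptions,
# not taken from the role description.
REQUIRED_FIELDS = {"booking_id", "origin", "destination", "price"}

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality violations for one record."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    price = record.get("price")
    if price is not None and price < 0:
        errors.append("price must be non-negative")
    return errors

def validate_batch(records: list[dict]):
    """Route each record to the clean set or the reject set with reasons."""
    clean, rejected = [], []
    for rec in records:
        errs = validate_record(rec)
        if errs:
            rejected.append((rec, errs))
        else:
            clean.append(rec)
    return clean, rejected
```

Rejecting rows with recorded reasons, rather than silently dropping them, is what keeps downstream reporting and analytics trustworthy.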
Requirements
3+ years of experience in data engineering, data development, or data management
Strong hands-on experience with Snowflake and modern data warehouse concepts (data lakes, lakehouse, streaming)
Proficiency in Python and SQL for building and optimizing data pipelines
Hands-on experience with AWS services such as S3, Glue, Lambda, Redshift, and data platforms such as Snowflake
Experience with ETL/ELT, data modeling, and data warehousing concepts
Experience with orchestration tools (Airflow, Dagster)
Hands-on experience with PySpark and distributed data processing frameworks (e.g., AWS EMR)
Knowledge of pipeline performance optimization and debugging
Strong problem-solving, analytical, and collaboration skills
Experience with version control (Git) and CI/CD workflows
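The orchestration tools named above (Airflow, Dagster) schedule pipeline tasks in dependency order. A toy sketch of that core idea in plain Python, using the standard-library `graphlib`; the task names and bodies are hypothetical stand-ins, not a real DAG definition:

```python
from graphlib import TopologicalSorter

# Hypothetical ETL steps standing in for real pipeline tasks.
def extract():
    return "raw rows"

def transform():
    return "clean rows"

def load():
    return "loaded"

# Each task maps to the set of tasks it depends on.
DAG = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}

TASKS = {"extract": extract, "transform": transform, "load": load}

def run_dag(dag: dict, tasks: dict) -> list[str]:
    """Execute tasks in topological (dependency) order; return that order."""
    order = list(TopologicalSorter(dag).static_order())
    for name in order:
        tasks[name]()
    return order
```

Production orchestrators add what this sketch omits: scheduling, retries, backfills, and observability, which is why the role asks for hands-on experience with them rather than home-grown runners.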