Data Engineer responsible for building and maintaining data solutions on AWS and GCP. Focus on Lakehouse architecture to support analytics, reporting, and AI/ML use cases.
Responsibilities
Build and maintain Lakehouse‑based data pipelines using object storage and open table formats
Develop batch and streaming pipelines using AWS and GCP services
Implement medallion architecture patterns for structured data processing
Implement curated and consumption‑ready datasets as part of domain‑oriented data products
Support development of analytical data models optimized for BI and reporting use cases
Contribute to CI/CD pipelines for data pipeline deployments
Collaborate with data engineers, platform teams, and DevOps to deliver end‑to‑end data solutions
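The medallion pattern mentioned above can be illustrated with a toy sketch. This is plain Python for readability, not the actual stack: a production pipeline would use Spark, Glue, or Dataflow writing Iceberg tables on object storage. Record shapes and layer names here are illustrative assumptions.

```python
# Toy medallion pipeline: bronze (raw) -> silver (cleansed/typed) -> gold
# (aggregated, consumption-ready). Plain Python keeps the sketch runnable;
# real layers would be tables on S3/GCS in an open table format.

bronze = [  # raw events as landed, including one malformed record
    {"user": "a", "amount": "10.5"},
    {"user": "b", "amount": "7"},
    {"user": "a", "amount": "bad"},  # dropped during cleansing
]

def to_silver(records):
    """Cleanse and type raw records; drop rows that fail validation."""
    silver = []
    for r in records:
        try:
            silver.append({"user": r["user"], "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # quarantine/drop bad rows at the silver boundary
    return silver

def to_gold(records):
    """Aggregate cleansed records into a consumption-ready summary."""
    totals = {}
    for r in records:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

gold = to_gold(to_silver(bronze))
print(gold)  # {'a': 10.5, 'b': 7.0}
```

The key idea the sketch captures: each layer only consumes the layer below it, and data quality guarantees tighten as records move toward gold.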
Requirements
2–4 years of experience in data engineering or related roles
Hands‑on experience building data pipelines on cloud platforms
Experience with AWS data services (e.g., S3, Glue, Kinesis, Athena)
Experience with GCP data services (e.g., GCS, Dataproc, Dataflow, BigQuery)
Familiarity with Lakehouse concepts and open table formats such as Apache Iceberg
Proficiency in Python and SQL
Understanding of batch and streaming data processing concepts
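The batch-versus-streaming distinction in the requirements can be shown in miniature. This is a conceptual sketch, not a representation of any specific service: in batch mode the whole dataset is materialised before computing, while in streaming mode each event updates running state as it arrives (the model behind Kinesis or Dataflow consumers). The event shape is a made-up example.

```python
# Batch vs. streaming in miniature: same aggregation, different execution model.

def batch_total(events):
    """Batch: materialise the full dataset, then compute once."""
    data = list(events)
    return sum(e["value"] for e in data)

def streaming_totals(events):
    """Streaming: emit an updated running total after every event."""
    total = 0
    for e in events:  # the event source may be unbounded
        total += e["value"]
        yield total

events = [{"value": 3}, {"value": 4}, {"value": 5}]
print(batch_total(events))             # 12
print(list(streaming_totals(events)))  # [3, 7, 12]
```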
Benefits
Enjoy your best years with our retirement savings plan
Have peace of mind and body with our health insurance
Make yourself a priority with flexible schedules, parental leave and our holiday purchase scheme
Drive forward your career through professional development opportunities
Achieve your personal goals with our Employee Assistance Programme