GCP Data Engineer developing and maintaining scalable data pipelines using Google Cloud Platform at Vodafone. Collaborating with teams to optimize data processes and ensure quality.
Responsibilities
Design, develop, and maintain scalable data pipelines and ETL processes using GCP services such as BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer, Cloud Functions, and Cloud Run
Collaborate with data scientists, analysts, and other stakeholders to understand data needs and build effective solutions
Implement data ingestion, processing, and storage solutions for structured and unstructured data from varied sources
Optimize and tune data pipelines for performance, reliability, and cost efficiency
Ensure data quality and integrity through validation, cleansing, and transformation
Develop and maintain data models, schemas, and metadata to support analytics and reporting
Monitor, diagnose, and resolve data pipeline issues promptly
Stay current with GCP technologies and recommend improvements
Mentor junior engineers and encourage knowledge sharing across the team
Requirements
Degree or higher qualification in Computer Science, Information Technology, or a related field
3–5 years of experience in data engineering with a focus on Google Cloud Platform
Proficient in BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Composer, Cloud Functions, and Cloud Run
Strong programming skills in Python and PL/SQL
Experienced with SQL and NoSQL databases
Knowledgeable in data warehousing concepts and data integration frameworks
Excellent analytical, problem‑solving, and communication skills
Collaborative, adaptable, and able to operate in dynamic environments
Able to guide peers and contribute constructively to technical discussions
Benefits
Opportunity to work on a strategically significant greenfield project
Exposure to modern cloud-native data engineering practices using GCP
Ability to shape frameworks, standards, and best practices for the wider team
Collaboration across cross‑functional, international teams
Data Engineer responsible for building ELT pipelines and operating data platforms at Auctionet. Collaborate closely with analytics and infrastructure teams in Stockholm.
Data Engineer responsible for industrializing scalable AI solutions for a recognized French scale-up. Collaborating on data engineering projects, optimizing data pipelines, and mentoring junior engineers.
Working Student in Data Engineering at Windtastics, enhancing data workflows and automation processes. Focus on Airflow DAGs, data modeling, and team support.
Data Engineer designing and implementing end-to-end data processes, leveraging cloud-based infrastructures. Join Capgemini to work on sustainable and inclusive technology solutions.
Senior Data Engineer at Setpoint developing important data systems for asset-backed lending. Owning full-stack features from design to deployment with a modern tech stack in a hybrid role.
Senior Data Engineer responsible for evolving the data pipeline at Mytra, enhancing supply chain solutions through data-driven insights and collaboration.
Data Engineer developing data pipelines and stream processing solutions for Leonardo in the Cyber & Security Solutions area. Supporting data ingestion, processing, and analytics for large-scale datasets.
Manager, Data Engineering leading the cloud-native data engineering vision at Grainger. Developing scalable platforms and mentoring data engineers to enhance quality and business impact.
Big Data Engineer optimizing scalable data solutions using Hadoop, PySpark, and Hive at Citi. Responsible for building ETL pipelines and ensuring data quality in a hybrid work environment.