Senior BigQuery Engineer designing high-performance data solutions for Deutsche Bank, collaborating in an agile environment on data quality and cloud migration.
Responsibilities
Design, develop, and maintain scalable data warehouse solutions in BigQuery, including dataset architecture, schema design, and data modeling (star/snowflake)
Design and implement a cloud-native data warehouse to replace on-premises infrastructure
Optimize performance through partitioning, clustering, query tuning, and workload management
Build data processing pipelines and workflows feeding BigQuery from multiple sources
Implement data quality checks, validation frameworks, and reconciliation processes
Work closely with product owners and analysts to ensure solutions align with the business requirements specification
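The data quality and reconciliation duties above can be sketched as a minimal source-vs-target comparison of row counts and per-row checksums. This is an illustrative, framework-free sketch: the table contents are invented, and in practice the two sides would come from a source extract and a BigQuery query result.

```python
# Minimal sketch of a source-vs-target reconciliation check.
# All rows below are illustrative; in practice the two inputs would be
# a source-system extract and the corresponding BigQuery query result.
import hashlib


def row_fingerprint(row: dict) -> str:
    """Order-insensitive fingerprint of a single record."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode()).hexdigest()


def reconcile(source_rows: list[dict], target_rows: list[dict]) -> dict:
    """Compare row counts and per-row fingerprints; report mismatches."""
    src = {row_fingerprint(r) for r in source_rows}
    tgt = {row_fingerprint(r) for r in target_rows}
    return {
        "count_match": len(source_rows) == len(target_rows),
        "missing_in_target": len(src - tgt),
        "unexpected_in_target": len(tgt - src),
    }


if __name__ == "__main__":
    source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 20.0}]
    target = [{"id": 1, "amount": 10.0}]
    print(reconcile(source, target))
```

A production version would typically run per partition and write mismatch counts to a monitoring table rather than printing them.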
Requirements
University degree in computer science or a comparable qualification
At least 5 years of data engineering experience
Strong SQL knowledge, preferably on Google Cloud Platform (GCP)
Experience orchestrating workflows with Cloud Composer (Apache Airflow), including DAG development, scheduling, monitoring, and dependency management
Hands-on experience building data processing pipelines using Dataflow (batch and streaming)
Good understanding of data warehousing concepts, data flows, and data feeds
Experience using Bitbucket for Git-based source control and collaboration
Ability to work collaboratively in a dynamic environment
Bonus Skills: Hands-on experience with Python/PySpark for scalable distributed data processing
Experience with Infrastructure as Code (Terraform, Ansible, Chef)
Knowledge of shell scripting
Experience in financial services or regulated environments
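The Cloud Composer requirement above could be illustrated by a minimal DAG configuration sketch. Everything here is hypothetical: the DAG id, schedule, and stored procedure are invented names, and the sketch assumes an Airflow 2.x environment with the Google provider package installed.

```python
# Hypothetical Cloud Composer DAG sketch; names and schedule are illustrative.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_trades_load",  # hypothetical pipeline name
    schedule="0 6 * * *",        # run daily at 06:00 UTC
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    start = EmptyOperator(task_id="start")
    load = BigQueryInsertJobOperator(
        task_id="load_trades",
        configuration={
            "query": {
                # hypothetical stored procedure doing the actual load
                "query": "CALL analytics.load_trades()",
                "useLegacySql": False,
            }
        },
    )
    # Dependency management: load runs only after start succeeds.
    start >> load
```

Scheduling, retries, and task dependencies are all declared in the DAG itself, which is what makes Composer pipelines easy to monitor and reason about.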
Benefits
New beginnings can be a challenge. We promise a smooth integration and a supportive mentor
Pick your working style: choose from Remote, Hybrid or Office work opportunities
Early bird or night owl? Our projects have different working hours to suit your needs
Nobody is born an expert. Sharpen your tech skills with our sponsored certifications, training courses and top e-learning platforms
We want you to stay healthy! Enjoy our Private Health Insurance – it’s custom-made for you
A clear mind is a healthy mind. Attend individual coaching sessions or go one step further by joining our accredited Coaching School
Make the most of our epic parties or themed events – they’re lovingly designed for our people and their families
IT Data Engineer passionate about data solutions supporting digital transformation at Sizewell C. Join a collaborative team building data pipelines and platforms for a major infrastructure project.
Cloud & Data Engineer working with large datasets in innovative projects for Marketing Technology team. Focus on cloud platforms and development of scalable systems for digital marketing support.
Data Engineer responsible for building and maintaining data solutions using Microsoft Fabric. Working within a consultancy environment to meet client expectations across various sectors.
Data Engineering Lead at Fetch owning the end-to-end data platform for AI, pricing, and operations. Collaborate with teams to enable real-time, data-driven decisions and trustworthiness.
Data Engineer responsible for building ELT/ETL pipelines and supporting data governance practices at Daniels Health. Joining a mission-driven company innovating in healthcare waste management across multiple countries.
Data Engineer designing and optimizing Azure-based data platforms for enterprise analytics. Developing scalable data pipelines and enabling insights through Power BI and Azure Synapse Analytics.
Senior Software Engineer focused on ingestion pipeline at Fullstory. Engineering distributed systems for processing data at scale while collaborating with technical leaders.
Junior Data Engineer contributing to data solutions in home24's Martech team. Focus on data pipelines, analytical workflows, and machine learning model scaling with cross-functional collaboration.
Data Engineer at Onepoint developing cloud-native architectures and scalable data solutions. Collaborating on data processing pipelines and guiding clients on best practices.