About the role

  • Senior BigQuery Engineer designing and implementing cloud-native data solutions for financial datasets at Deutsche Bank. Collaborating in an agile environment with a focus on data quality.

Responsibilities

  • Design, develop, and maintain a scalable, cloud-native data warehouse in BigQuery
  • Optimize performance through partitioning, clustering, query tuning, and workload management
  • Build data processing pipelines and workflows feeding BigQuery
  • Implement data quality checks, validation frameworks, and reconciliation processes
  • Work closely with product owners and analysts

Requirements

  • University degree in computer science or a comparable qualification
  • At least 5 years of data engineering experience
  • Strong SQL knowledge, preferably on Google Cloud Platform (GCP)
  • Experience orchestrating workflows with Cloud Composer (Apache Airflow)
  • Hands-on experience building data processing pipelines using Dataflow
  • Good understanding of data warehousing concepts
  • Experience using Bitbucket for Git-based source control
  • Nice to have: Hands-on experience with Python/PySpark
  • Knowledge of shell scripting
  • Experience in financial services or regulated environments

Benefits

  • Private Health Insurance
  • Sponsored certifications and training
  • Flexible working hours
  • Individual coaching sessions
  • Company parties and themed events

Job title

Data Engineer, Python, BQ

Experience level

Mid level, Senior

Salary

Not specified

Degree requirement

Bachelor's Degree
