About the role

  • Build and manage data pipelines that ingest, transform, and serve data across the business
  • Contribute to the design and implementation of cloud-based infrastructure using Terraform
  • Work on complex data challenges, balancing short-term delivery with long-term platform evolution
  • Collaborate with data product owners and stakeholders to collect and refine data requirements
  • Optimise data storage, infrastructure performance, and cost within our data platform
  • Support the analytics data layer by enabling clean, reliable data for downstream use in Looker and BigQuery
  • Collaborate with analytics engineers, analysts, and data scientists to support their data use cases
  • Contribute to the evolution of our self-serve data platform and the implementation of our data strategy
  • Participate in agile ceremonies, including sprint planning, refinement, and retrospectives
  • Promote and practice excellent data and cloud engineering best practices, including testing, documentation, and observability

Requirements

  • Has 3–5 years of proven experience in data engineering, working on production-grade data pipelines and infrastructure within a large-scale, cloud-based data platform
  • Has worked in a complex organisation with a mature or evolving data platform (e.g. BigQuery, Looker)
  • Has strong Python coding skills and is confident writing complex SQL queries
  • Has demonstrable experience using Terraform to manage infrastructure as code on a large-scale platform
  • Has hands-on experience with GCP (BigQuery, GCS, Dataflow, Cloud Composer) or AWS (Redshift, S3, AWS Glue)
  • Has experience with dbt and understands data modelling principles and best practices
  • Has a deep understanding of data storage, modelling, and orchestration concepts
  • Has demonstrable experience setting up and managing data pipelines end-to-end
  • Brings a collaborative and curious mindset, and enjoys working in cross-functional teams
  • Is optimistic, curious, and excited by technology; enjoys keeping up with current trends and actively contributing to tech events
  • Is proactive, communicative, and comfortable contributing to technical discussions and design decisions
  • Has experience working with version control (Git) and CI/CD practices
  • Has an awareness of data quality, privacy, and security best practices
  • Has knowledge of GDPR and of frameworks for testing, monitoring, and alerting on data pipelines

Benefits

  • Cash plan for dental, optical and physio treatments
  • Private Medical Insurance, Pension and Life Insurance, Employee Assistance Plan
  • 27 days' holiday, plus two paid volunteering days a year to give back, and a holiday buy scheme
  • Hybrid working pattern with 2 days in office
  • Contributory stakeholder pension
  • Life assurance at 4x your basic salary, payable to a spouse, family member, or other nominated person in your life
  • Competitive compensation package
  • Paid leave for maternity, paternity, adoption & fertility
  • Travel Loans, Bike to Work scheme, Rental Deposit Loan
  • Charitable contributions through Payroll Giving and donation matching
  • Access to deals and discounts on travel, electronics, fashion, gym memberships, cinema tickets, and more

Job title

Data Engineer

Job type

Experience level

Mid level / Senior

Salary

Not specified

Degree requirement

Bachelor's Degree

Location requirements
