Build and manage data pipelines that ingest, transform, and serve data across the business
Contribute to the design and implementation of cloud-based infrastructure using Terraform
Work on complex data challenges, balancing short-term delivery with long-term platform evolution
Collaborate with data product owners and stakeholders to collect and refine data requirements
Optimise data storage, infrastructure performance, and cost within our data platform
Support the analytics data layer by enabling clean, reliable data for downstream use in Looker and BigQuery
Collaborate with analytics engineers, analysts, and data scientists to support their data use cases
Contribute to the evolution of our self-serve data platform and the implementation of our data strategy
Participate in agile ceremonies, including sprint planning, refinement, and retrospectives
Promote and practice excellent data and cloud engineering best practices, including testing, documentation, and observability
Requirements
Has 3–5 years of proven experience in data engineering, working on production-grade data pipelines and infrastructure within a large-scale, cloud-based data platform
Has worked in a complex organisation with a mature or evolving data platform (e.g. BigQuery, Looker)
Has strong Python coding skills and is confident writing complex SQL queries
Has demonstrable experience using Terraform to manage infrastructure as code in a large-scale platform
Has hands-on experience with GCP (BigQuery, GCS, Dataflow, Cloud Composer) or AWS (Redshift, S3, AWS Glue)
Has experience with dbt and understands data modelling principles and best practices
Has a deep understanding of data storage, modelling, and orchestration concepts
Has demonstrable experience setting up and managing data pipelines end-to-end
Brings a collaborative and curious mindset, and enjoys working in cross-functional teams
Is optimistic and curious, gets excited by tech, and enjoys keeping up with current trends; actively gets involved in and enjoys contributing to tech events
Is proactive, communicative, and comfortable contributing to technical discussions and design decisions
Has experience working with version control (Git) and CI/CD practices
Is aware of data quality, privacy, and security best practices
Has knowledge of GDPR and data security best practices, as well as frameworks for testing, monitoring, and alerting for data pipelines
Benefits
Cash plan for dental, optical and physio treatments
Private Medical Insurance, Pension and Life Insurance, Employee Assistance Plan
27 days holiday plus two (paid) volunteering days a year to give back, and holiday buy schemes
Hybrid working pattern with 2 days in office
Contributory stakeholder pension
Life assurance at 4x your basic salary, payable to a spouse, family member or other nominated person in your life
Competitive compensation package
Paid leave for maternity, paternity, adoption & fertility
Travel Loans, Bike to Work scheme, Rental Deposit Loan
Charitable contributions through Payroll Giving and donation matching
Access deals and discounts on things like travel, electronics, fashion, gym memberships, cinema discounts and more