Onsite Data Engineer

About the role

  • As a Data Engineering Professional at BT Group, you will enable data-driven decision making through scalable data solutions, with responsibility for the development, testing, and deployment of ETL/ELT data pipelines.

Responsibilities

  • Deliver the development, testing, and deployment of ETL/ELT data pipelines for ingesting, transforming, and curating data, ensuring adherence to data security and privacy standards.
  • Contribute to designing and enhancing data warehouse architecture and reusable engineering patterns for processing and storing large-scale structured and unstructured datasets.
  • Support major engineering initiatives by shaping technical approaches and contributing innovative solutions to complex or ambiguous problems.
  • Produce business-relevant, high-quality data outputs that strengthen the organisation’s data assets and overall data engineering capability.
  • Apply engineering best practices and Agile delivery methodologies to ensure consistent, high-quality, and iterative delivery of data engineering solutions.
  • Execute initiatives to build and improve data and analytics infrastructure, enabling scalable and reusable data engineering capabilities.
  • Identify and implement improvements to data engineering processes by analysing data, optimising workflows, and driving efficiency gains.
  • Conduct thorough quality assurance and testing of data pipelines, models, and engineering frameworks to ensure accuracy, reliability, and fitness for business use.

Requirements

  • Hands-on experience with key Google Cloud Platform services such as BigQuery, Cloud Storage, Cloud Composer, Dataflow, and Pub/Sub.
  • Proficiency in SQL, including writing complex queries, optimizing performance, handling large datasets, and implementing data validation and quality checks across analytical platforms (a brief sketch follows this list).
  • Experience in designing and implementing Data Warehouse solutions across different platforms, including schema design, ETL/ELT patterns, and performance tuning.
  • Experience in Python for building data pipelines and automation using cloud services, APIs, and SDKs, with an understanding of modular coding, error handling, and reusable components (see the second sketch after this list).
  • Experience with Terraform for infrastructure as code; a willingness and ability to learn quickly are essential.
  • Understanding of DevOps principles, with hands-on experience in version control, CI/CD pipelines, automation, and defect management.
  • Mandatory experience with GitLab for code management and deployments.
  • Working knowledge of Agile delivery practices and tools such as Jira and Confluence, with the ability to collaborate in fast-paced, iterative development cycles.
  • Strong analytical and communication skills, with the ability to interpret complex data and present insights in clear, user-friendly formats for technical and non-technical stakeholders.
  • Experience managing deadlines and delivering high-quality work under pressure in dynamic and fast-moving environments.
  • Experience creating processes and documentation, including developing standard operating procedures, architectural diagrams, and technical documentation.
  • Ability to work effectively with multiple stakeholders, collaborating across engineering, product, business, and operations teams to drive aligned outcomes.
  • Experience in data quality frameworks, including implementing data validation, monitoring, and continuous improvement techniques.
  • Familiarity with incident and service management tools such as ServiceNow for handling production issues, change management, and operational workflows.
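
As an illustration of the SQL and data-quality skills listed above, here is a minimal sketch of a validation check run against BigQuery with the google-cloud-bigquery Python client. The dataset, table, and column names (analytics.orders, order_id, order_total) are hypothetical placeholders, not part of BT Group's actual environment.

    # Minimal sketch: a row-level data-quality check run against BigQuery.
    # Table and column names are hypothetical, not a real BT Group schema.
    from google.cloud import bigquery

    def count_invalid_orders(client: bigquery.Client) -> int:
        """Return the number of rows failing basic validation rules."""
        sql = """
            SELECT COUNT(*) AS invalid_rows
            FROM `analytics.orders`
            WHERE order_id IS NULL      -- missing key
               OR order_total < 0       -- impossible negative amount
        """
        rows = client.query(sql).result()  # blocks until the query completes
        return next(iter(rows)).invalid_rows

    if __name__ == "__main__":
        client = bigquery.Client()  # uses application-default credentials
        print(f"Invalid rows: {count_invalid_orders(client)}")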
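
And as a second sketch, this time of the Python pipeline skills above: a small Pub/Sub subscriber that keeps the transform step modular and the error handling explicit. The project ID, subscription name, and payload shape are illustrative assumptions only.

    # Minimal sketch: modular Pub/Sub ingestion with explicit error handling.
    # Project and subscription IDs are hypothetical placeholders.
    import json
    import logging

    from google.cloud import pubsub_v1

    logging.basicConfig(level=logging.INFO)

    def transform(record: dict) -> dict:
        """Example transform step; a real pipeline would curate the payload here."""
        record["amount"] = float(record.get("amount", 0))
        return record

    def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
        try:
            record = transform(json.loads(message.data))
            logging.info("Processed record: %s", record)
            message.ack()   # acknowledge only after successful processing
        except (ValueError, KeyError):
            logging.exception("Bad payload; returning message for redelivery")
            message.nack()  # let Pub/Sub retry or route to a dead-letter topic

    def main() -> None:
        subscriber = pubsub_v1.SubscriberClient()
        path = subscriber.subscription_path("my-project", "orders-sub")
        future = subscriber.subscribe(path, callback=handle_message)
        try:
            future.result()  # block and process messages until interrupted
        except KeyboardInterrupt:
            future.cancel()

    if __name__ == "__main__":
        main()

Acknowledging only after a successful transform, and nacking on failure, is one common way to keep bad records from being silently dropped.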

Benefits

  • Flexible working hours
  • Professional development opportunities

Job title

Data Engineer

Job type

Experience level

Mid level, Senior

Salary

Not specified

Degree requirement

Bachelor's Degree

Location requirements

Onsite