Lead Data Engineer overseeing the migration from Redshift to AWS RDS and mentoring data analysts in Python and DataOps practices.
Responsibilities
Lead New Data Pipeline Delivery: Build the new Python-based ETL pipelines that move data from Luna’s microservice DynamoDB tables into the RDS transactional database (Postgres OLTP) and on to the RDS data warehouse (Postgres OLAP).
Platform Optimization: Apply expertise from previous data platforms to tune the new Postgres OLTP environment for transactional performance and the OLAP environment for analytics, keeping ongoing cost management in view for both.
Standard Setting: Establish and enforce DataOps standards, including version control (Git), automated CI/CD deployment, and schema migration management using tools like Liquibase or Drizzle ORM.
Hands-on Coaching: Actively mentor team members, elevating their skills in Python, Git, and engineering workflows through code reviews and workshops.
Code Quality: Conduct rigorous code reviews, providing detailed, educational feedback that explains best practices and clearly outlines required changes to elevate team standards.
AI Engineering Patterns: Support the definition and implementation of AI tooling and practices to augment the data engineering team.
Initiative Planning: Break larger reporting initiatives down into manageable epics and stories within our Agile framework (Scrum for larger work items, Kanban for smaller continuous-flow items).
Stakeholder Management: Engage with business domains to capture requirements and provide strategic guidance on the data platform migration plan.
Product Partnership: Support the Data Product Owner by providing the technical context necessary to prioritize the team's backlog effectively.
Requirements
Data Engineering - 7 to 10 years' experience building and supporting full-stack data pipelines from source to reporting.
AWS Mastery - 5+ years' deep experience with AWS, ideally administering and optimizing RDS/Aurora (Postgres) and Redshift.
Python - 5+ years' expert-level Python, with specific experience building ETL pipelines, including libraries like Pandas and orchestration tools like Airflow or Prefect.
PostgreSQL - 5+ years' expert-level SQL and database architecture (partitioning, indexing) for both OLTP and OLAP workloads.
DataOps / Tools - 3+ years' expertise in Git, with a solid understanding of branching and workflows, plus schema management (e.g., Liquibase, Flyway, or Drizzle).
Benefits
25 days holiday allowance + bank holidays
Share scheme
A £1,000 flexifund to spend on a personalised list of benefits, such as gym membership, the Cycle to Work Scheme, or a health, dental and optical cash plan