Senior Data Engineer at Multiverse, building data infrastructure and APIs that drive AI and tech adoption. Collaborate with teams to ensure efficient access to product data and support business needs.
Responsibilities
Take ownership of our data architecture. You will define schemas for new entities and refactor existing models to improve performance and clarity as the product evolves.
Create APIs and connectors that allow users to access data easily without needing deep infrastructure knowledge.
Enable the "Universal Data Layer" that powers our AI agents and internal services.
Build features in our Internal Developer Platform that make it easy to deploy and manage AI models.
Remove the friction between "training a model" and "running it in production."
Automate security and compliance checks so that data is classified and safe by default.
Replace manual approval gates with automated guardrails, ensuring speed without compromising safety.
Design and develop services that wrap complex business logic into clean, reusable APIs.
Create a "Productised Data" layer, making it easy for non-technical stakeholders to pull high-fidelity reports and build dashboards without needing to understand the underlying raw tables.
Transition legacy data scripts and coupled domains into robust, version-controlled services that the entire company can rely on.
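To make the "Productised Data" idea above concrete, here is a minimal, hypothetical sketch: the raw table, field names, and report shape are illustrative assumptions, not part of the role description. It shows raw warehouse-style rows wrapped behind one clean, typed entry point so consumers never touch the underlying table:

```python
from dataclasses import dataclass

# Hypothetical raw rows, standing in for an underlying warehouse table.
RAW_ORDERS = [
    {"order_id": "o-1", "amount_pence": 1250, "region": "UK"},
    {"order_id": "o-2", "amount_pence": 990, "region": "UK"},
    {"order_id": "o-3", "amount_pence": 4000, "region": "DE"},
]

@dataclass(frozen=True)
class RegionReport:
    """A 'productised' view: stakeholders see totals, not raw tables."""
    region: str
    order_count: int
    total_amount_pence: int

def region_report(rows, region: str) -> RegionReport:
    """Wrap the raw-table logic behind one reusable, versionable API."""
    matching = [r for r in rows if r["region"] == region]
    return RegionReport(
        region=region,
        order_count=len(matching),
        total_amount_pence=sum(r["amount_pence"] for r in matching),
    )

print(region_report(RAW_ORDERS, "UK"))
```

A service like this can then be version-controlled and evolved independently of the raw schema, which is the point of replacing legacy scripts with proper services.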
Requirements
A solid grasp of system design and data structures, coupled with a strong software engineering foundation. You write code that is tested, modular, and readable (we use Python, TypeScript, and Go).
Proficiency with cloud-native development (AWS or Azure) and containerisation (Kubernetes/Docker).
Experience with cloud data warehouses (e.g. Snowflake, Postgres) and with exposing warehouse data through APIs such as GraphQL.
Experience in data engineering, including building and maintaining data pipelines. You treat data pipelines like software.
Experience implementing CI/CD (CircleCI/GitHub Actions/GitLab), automated testing, and Data Observability (Datadog).
You can articulate ideas clearly to both engineers and product partners.
You have experience with platform engineering principles and can demonstrate empathy for users, prioritising usability, configurability, and long-term sustainability.
You care deeply about code quality, testing, and documentation, and you aim to build systems that are easy to understand and operate.
You are comfortable refactoring monolithic data structures into modular services that prioritise ease of use for the end consumer.
Interest in or experience with building infrastructure for GenAI, such as vector databases or MCP (Model Context Protocol) servers.
Familiarity with event-driven architectures is a bonus.
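One concrete reading of "treating data pipelines like software" from the requirements above, sketched with made-up data and a hypothetical cleaning step (nothing here comes from the posting): each transformation is a small pure function that can be unit-tested in isolation, just like any other production code:

```python
def normalise_emails(records):
    """A pipeline step written as a plain, testable function:
    lowercase and trim emails, and drop rows without one."""
    cleaned = []
    for row in records:
        email = (row.get("email") or "").strip().lower()
        if email:
            cleaned.append({**row, "email": email})
    return cleaned

# Illustrative input a pipeline might receive.
sample = [
    {"id": 1, "email": "  Ada@Example.COM "},
    {"id": 2, "email": None},
]
print(normalise_emails(sample))
```

Because the step has no side effects, it slots directly into CI/CD: a failing assertion blocks the deploy, which is the "automated guardrail" posture the role describes.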
Benefits
27 days holiday, plus 5 additional days off: 1 life event day, 2 volunteer days, and 2 company-wide wellbeing days (M-Powered Weekend); plus 8 bank holidays per year
Private medical insurance with Bupa, a medical cashback scheme, life insurance, gym membership and wellness resources through Wellhub, and access to Spill, an all-in-one mental health support service
Hybrid work offering: for most roles we collaborate in the office three days per week, with the exception of Coaches and Instructors, who collaborate in the office once a month
Work-from-anywhere scheme: you'll have the opportunity to work from anywhere, up to 10 days per year
Space to connect: Beyond the desk, we make time for weekly catch-ups, seasonal celebrations, and have a kitchen that’s always stocked!