Data Engineering Lead at Fetch owning the end-to-end data platform for AI, pricing, and operations. Collaborate with teams to enable real-time, data-driven decisions and build trust in the data.
Responsibilities
Own Fetch’s data platform end-to-end – from ingestion and modelling to observability, experimentation, and AI evaluation
Design the data foundations that make AI safe, measurable, and reliable: datasets, evals, feedback loops, and monitoring that keep our agents honest
Build and own Fetch’s entire data stack – ingestion, pipelines, warehouse, observability
Partner with AI engineers to productionise agents and models with clear metrics, feedback loops, and evaluation frameworks
Make data flow in real-time across pricing, product, claims, ops, and AI
Automate everything – alerts, tests, and fail-safes so nothing breaks silently
Enable smarter pricing, sharper decisions, and clearer insight across the business
Partner with engineering and AI teams to productionise data-driven features
Requirements
5+ years building and maintaining data platforms or analytics engineering stacks at scale
Strong with Python and SQL, and comfortable with dbt, modern warehouses, and event-driven data
Experience designing reliable batch and/or streaming pipelines with strong observability and testing
Pragmatic builder – you know when to ship a simple solution and when to invest in scalable architecture
You care about product: you like to understand the business problem, challenge requirements, and push for outcomes over output
Obsessed with data quality, trustworthiness, and clear definitions (metrics, contracts, schemas)
Clear communicator, effective collaborator across engineering, product, ops, and leadership
Bonus – experience designing evaluation frameworks for AI/LLM systems (offline evals, golden sets, regression tests, monitoring)
Bonus – experience supporting AI agents or ML products (feature engineering, feedback loops, human-in-the-loop systems)
Bonus – experience in insurance, healthcare/veterinary, fintech, or other regulated environments
Benefits
Competitive Series A salary + meaningful equity
Hybrid working (3 days Sydney office, flexible WFH)
Latest MacBook Pro and a top setup
Two team retreats each year (Blue Mountains, SXSW, Singapore)
Office dogs for cuddles and interruptions
Bean to cup coffee machine, unlimited fruit and snacks. Toblerone on-tap
Senior Data Engineer at Red Hat designing and optimizing data solutions supporting sales and forecasting. Collaborating with teams and applying modern data engineering practices to ensure data quality.
Senior Data Engineer leading the design and implementation of data pipelines for NVIDIA’s analytics and monitoring systems. Collaborating across teams to enhance data ingestion and analysis capabilities.
Associate Data Engineer at Boeing India supporting API Development and Data migration with a focus on engineering and technology solutions. Involves working independently to gather requirements and supporting architecture for API services and data analytics.
Senior Data Engineer building and maintaining robust data pipelines for various data products at Beep Saúde. Collaborating within the team and leading data governance practices.
Software Developer in Test working on a cloud-based data platform at Tecsys. Ensuring quality and reliability of data pipelines and transformations using automation frameworks.
Data Engineer responsible for designing, building, and optimizing data pipelines and architectures in a tech environment. Requires extensive experience with modern data warehousing and cloud platforms.
Lead Data Engineer role at Brillio focusing on AI & Data Engineering with expertise in Azure and MS Fabric. Collaborate within the Data Engineering team in Pune, Maharashtra, India.
Data Architect at Whiteshield designing scalable, secure data architectures for national and enterprise transformation programs. Architecting modern data platforms to support analytics, AI and operational use cases.
Data Engineer managing scalable data ecosystems for actionable business intelligence and cross-functional stakeholder collaboration. Optimizing ETL/ELT pipelines and ensuring data integrity and security.
Data Engineer specializing in data architecture and solutions for a banking environment, driving value for customers through innovative engineering practices and technologies in data management.