Engineer building and operating data systems for Spotify's marketing initiatives. Contributing to data pipelines and integrations supporting global campaigns like Wrapped.
Responsibilities
build and operate batch and real-time data pipelines on Google Cloud Platform that process billions of marketing and user events daily, powering paid media attribution, audience targeting, and campaign measurement across global marketing platforms
partner with marketing, analytics, and product teams to instrument data collection, translate requirements, and ensure data quality while operating within legal and regulatory constraints
contribute to integrations with external marketing platforms, supporting reliable audience delivery and suppression workflows
monitor and improve production systems, strengthening reliability, observability, and operational readiness
work within a cross-functional, agile squad that builds and owns its data systems end to end, partnering with marketing, analytics, and insights teams to deliver solutions aligned with business priorities
collaborate with fellow engineers to design, build, and operate data systems, while contributing to shared standards, tooling, and continuous improvements in how the team works
Requirements
solid experience building and maintaining production data pipelines and systems
familiar with development and deployment within cloud environments (ideally GCP)
fluent in SQL and at least one programming language (Python, Java, or Scala)
experience using cloud data warehouses like BigQuery, Snowflake, or Redshift
understand batch and streaming data processing, ETL/ELT patterns, and data modeling fundamentals
care deeply about data quality, system reliability, and building systems downstream users can trust
value strong engineering practices, including testing, monitoring, and maintainable code
comfortable incorporating GenAI tools like Claude into your development workflow
comfortable working in an agile, product-oriented environment and collaborating across engineering, product, analytics, and marketing
pragmatic, hands-on, and able to balance delivery speed with long-term quality
thrive in ambiguous and fast-changing environments, and know how to make progress even when requirements are evolving
experience with marketing technology ecosystems (attribution platforms, CDPs, ad platforms) is a bonus
Quant Data Engineer developing Oracle PL/SQL-based solutions for risk analytics at Asset Management Technology. Collaborating with teams to optimize database performance and ensure data quality.
Data Engineer 1 engaging with multiple departments to transform data into actionable information. Gathering requirements and supporting the data warehouse.
Data Engineer at StarRez designing and maintaining data pipelines for analytics. Collaborating with teams to drive data quality and insights across the product.
Digital Analytics Capability - Adobe Data Engineer helping Bankwest build the analytical foundations for its digital experiences. Implementing and maintaining Adobe Experience Cloud applications for customer engagement.
AWS Data Architect overseeing enterprise data platform architecture for Signet Jewelers. Guiding engineering teams and ensuring data solutions are reliable and aligned with enterprise strategy.
MDM Data Engineer managing Profisee MDM platform and ensuring data quality in enterprise systems at Pacific Life. Collaborating with data stewards and integrating with upstream and downstream systems.
Senior/Lead Data Engineer at HOLYWATER TECH managing infrastructure for analytical platforms such as BigQuery and overseeing data integration. Collaborating with Data Product Owners and carrying broad engineering responsibilities.
Process Mining Data Engineer implementing Celonis across business units at LSEG. Collaborating with executives and teams to optimize operations and drive business outcomes.
Senior Data Engineer focusing on Retrieval-Augmented Generation (RAG) and AI solutions at LexisNexis. Collaborating with teams to integrate AI into existing systems and optimizing models for performance.