Data Engineer at Voodoo optimizing real-time data pipelines for gaming and consumer apps to support growth. Joining a top-tier data team dedicated to monetizing via advertising partners in a competitive landscape.
Responsibilities
Build, maintain, and optimize real-time data pipelines to process bid requests, impressions, clicks, and user engagement data.
Develop scalable solutions using tools like Apache Flink, Spark Structured Streaming, or similar stream processing frameworks.
Collaborate with backend engineers to integrate OpenRTB signals into our data pipelines and ensure smooth data flow across systems.
Ensure data pipelines deliver high-throughput, low-latency, fault-tolerant processing in real time.
Write clean, well-documented code in Java, Scala, or Python for distributed systems.
Work with cloud-native messaging and event platforms such as GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka to ensure reliable message delivery.
Assist in the management and evolution of event schemas (Protobuf, Avro), including data consistency and versioning.
Implement monitoring, logging, and alerting for streaming workloads to ensure data integrity and system health.
Continuously improve data infrastructure for better performance, cost-efficiency, and scalability.
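The stream-processing duties above can be illustrated with a minimal, framework-free sketch: a tumbling-window aggregator that counts impressions and clicks per window, the same keyed-window pattern Flink or Spark Structured Streaming applies at scale. The event names and window size are illustrative assumptions, not the actual event schema.

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # illustrative tumbling-window size, not a real config value

def window_start(ts: float) -> int:
    """Align an event timestamp to the start of its tumbling window."""
    return int(ts) - int(ts) % WINDOW_SECONDS

def aggregate(events):
    """Count events per (window, event_type); events are (ts, type) pairs.

    Real engines (Flink, Spark Structured Streaming) layer watermarks,
    state backends, and exactly-once sinks on top of this same idea.
    """
    counts = defaultdict(int)
    for ts, event_type in events:
        counts[(window_start(ts), event_type)] += 1
    return dict(counts)

# Hypothetical ad events: (epoch seconds, type)
events = [(0, "impression"), (10, "impression"), (30, "click"),
          (65, "impression"), (70, "click")]
print(aggregate(events))
# {(0, 'impression'): 2, (0, 'click'): 1, (60, 'impression'): 1, (60, 'click'): 1}
```

In production the input would be a Kafka, Pub/Sub, or Kinesis consumer rather than an in-memory list, but the windowing logic is the same.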
Requirements
3-5+ years of experience in data engineering, with a strong focus on real-time streaming systems.
Familiarity with stream processing tools like Apache Flink, Spark Structured Streaming, Beam, or similar frameworks.
Solid programming experience in Java, Scala, or Python, especially in distributed or event-driven systems.
Experience working with event streaming and messaging platforms like GCP Pub/Sub, AWS Kinesis, Apache Pulsar, or Kafka.
Hands-on knowledge of event schema management, including tools like Avro or Protobuf.
Understanding of real-time data pipelines, with experience handling large volumes of event-driven data.
Comfortable working in Kubernetes for deploying and managing data processing workloads in cloud environments (AWS, GCP, etc.).
Exposure to CI/CD workflows and infrastructure-as-code tools such as Terraform, Docker, and Helm.
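The schema-management requirement above rests on one core idea: backward-compatible evolution, where a reader using a newer schema fills fields absent from older events with declared defaults (the rule Avro's schema resolution and Protobuf's optional fields both follow). A minimal sketch of that rule, with hypothetical field names and defaults:

```python
# Hypothetical v2 of a bid-request event schema. Fields added after v1
# carry defaults so events from old producers still decode cleanly.
SCHEMA_V2_DEFAULTS = {
    "bid_floor": 0.0,   # new in v2: default keeps v1 producers compatible
    "currency": "USD",  # new in v2
}

def decode(event: dict, defaults: dict) -> dict:
    """Read an event against the newer schema, filling absent fields
    with their declared defaults (backward compatibility)."""
    return {**defaults, **event}

v1_event = {"request_id": "abc", "app_id": "game-42"}  # written by an old producer
print(decode(v1_event, SCHEMA_V2_DEFAULTS))
# {'bid_floor': 0.0, 'currency': 'USD', 'request_id': 'abc', 'app_id': 'game-42'}
```

Removing a field or changing its type without a default breaks this guarantee, which is why schema registries enforce compatibility checks before a new version is published.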
Benefits
Competitive salary based on experience
Comprehensive relocation package (including visa support)
Swile Lunch voucher
Gymlib (100% borne by Voodoo)
Premium healthcare coverage (SideCare) for your family, 100% borne by Voodoo
Child day care facilities (Les Petits Chaperons rouges)
Data Architect leading design and implementation of cloud data platforms for digital transformation. Collaborating with stakeholders to define data strategies and governance models.
Data Engineer Consultant designing and optimizing data infrastructure for clients' business needs. Working with SQL and data visualization tools in a mainly remote role with some onsite responsibilities in Denver.
Data Engineer creating Real-Time Data Processing applications for a leading iGaming operator. Work involves stream data manipulation and collaboration in an Agile environment.
Cloud Data Engineer designing data architectures for cloud platforms at fifty-five. Collaborating with local and global teams to optimize marketing ROI and customer experience.
SAP Specialist responsible for designing, developing, and executing data migration objects in Hydro’s SAPEX program. Ensuring successful ETL processes and maintaining data quality.
Senior Data Engineer building scalable data pipelines and data models within retail at Avaron. Collaborating closely with business and technical teams to ensure reliable data solutions.
Senior Data Engineer building and operating the data platform at bsport. Collaborating with the Data team to optimize data intake and accessibility for analytics and AI.
Data Engineer building and maintaining Azure data platforms for Hultafors Group's analytics and reporting needs. Collaborating across various business functions in a cloud environment.
Lead Data Pipeline Manager at Valpak, overseeing data pipelines for environmental compliance initiatives. Collaborate with teams to ensure data quality and operational performance.
Data Engineer role responsible for building scalable data pipelines and systems at Consort Group in Portugal. Involves data engineering and regulatory reporting across diverse technical environments.