Data Engineer building modern Data Lake architecture on AWS and implementing scalable ETL/ELT pipelines. Collaborating across teams for analytics and reporting on gaming platforms.
Responsibilities
Build and continuously evolve a modern Data Lake architecture on AWS (S3, Glue, Athena, Iceberg).
Design and implement scalable ETL/ELT pipelines with Apache Airflow, dbt and Apache Spark for processing large volumes of data.
Design and implement a comprehensive game analytics and event tracking system for our gaming platform.
Implement data quality checks using dbt tests and monitoring solutions to ensure data accuracy and availability.
Work closely with our teams to deliver high-performance data analyses and reports.
Optimize data queries and storage for cost-efficient and high-performance analytics workloads.
Automate data processing workflows using Infrastructure as Code (IaC).
Requirements
Degree in Computer Science or a comparable qualification.
3–4 years of professional experience in data engineering or related fields.
Excellent knowledge of Python (pandas, pyarrow, boto3).
Solid experience with cloud platforms (preferably AWS) and modern data lake technologies.
Experience with message queues (RabbitMQ) and workflow orchestration (Airflow or similar tools).
Familiarity with columnar file formats (Parquet) and modern table formats (Apache Iceberg, Delta Lake).
Ideally experience with dbt, Spark, or comparable data transformation tools, and visualization tools such as Tableau or Metabase.
Good understanding of data modeling, partitioning strategies and performance optimization.
Strong openness to AI technologies and willingness to take responsibility for internal AI systems.
Ideally, experience in the gaming industry and an understanding of the specifics of gaming data and metrics.
Strong communication skills and empathy when interacting with diverse personalities.
High degree of self-organization and self-reflection for working in a hybrid, collaborative work model.
Can-do mentality and enthusiasm for driving initiatives forward.
Independent working style with quality as a success factor.
Very good German (C1) and good English.
Benefits
A values-driven, collaborative working environment.
Sustainable company growth.
Hybrid working model: maximum flexibility in terms of working hours and location.
Well-equipped office in Berlin-Mitte as a centrally located meeting place.
A warm welcome through an intensive, structured onboarding process.
Learning & development opportunities through the GAMOcademy.
Annual development review and targeted development measures.
Health care benefits – company health insurance, mental health platform, and bike leasing.
Associate Delivery Manager leading CDW and MDM solutions at Beghou Consulting, improving commercialization strategies for life sciences clients. Ensuring high-quality delivery of data solutions and mentoring junior staff.
Azure Data Engineer developing scalable data pipelines and collaborating with data teams for EU institutions. Working remotely from EU locations with a focus on high-quality data management.
Data Engineer/BI Analyst developing ETL processes and analyzing datasets at a fast-growing software solutions company in Athens. Focus on Azure Databricks and BI tools for data-driven insights.
Senior Data Engineer for iKnowHow S.A. designing scalable data pipelines and mentoring a team. Leading technical decisions and ensuring data quality in the software and robotics sector.
Vice President of Data Engineering leading the Reference Data Master team in transforming data strategies. Driving cloud-native solutions and modernizing party reference data management practices.
Cloud Data Engineer at BCBSNE using advanced technologies to enhance healthcare solutions. Collaborating with stakeholders and delivering scalable data pipelines with a focus on automation.
Data Engineer participating in a 1-month intensive academy focused on Snowflake and Databricks. Opportunity to work on enterprise-level projects after training in Monterrey, Mexico.
Data Engineering professional enhancing skills through a specialized academy on Snowflake and Databricks. Program leads to a permanent contract with a big client post-completion.
Senior Data Engineer responsible for data platform management at a Stockholm-based public transport company. Working with Azure and AWS tech in a hybrid role.
Data Migration Engineer creating and maintaining valuable datasets supporting a health service technology team. Collaborating with cross-functional teams to ensure data integrity and project success.