About the role

  • Data Engineer responsible for building ETL workflows and understanding business data for AI, collaborating with cross-functional teams to implement data-driven use cases.

Responsibilities

  • You build and maintain ETL workflows using tools like Airflow.
  • You develop and improve data-cleaning algorithms in Python.
  • You integrate structured data sources (e.g. ERP, CRM, product databases) into our platform.
  • You collaborate closely with AI and frontend teams to quickly implement new use cases.

Requirements

  • At least 2 years of experience building ETL pipelines and strong Python skills
  • Solid understanding of Business Intelligence, gained through work experience or academic projects
  • Motivation to take ownership and help shape structures
  • Experience in data modeling and ideally with cloud or DevOps tools (e.g. AWS, Docker, CI/CD)
  • Excellent English skills (German is a plus)

Benefits

  • Work in an ambitious AI tech startup with real-world impact
  • A high degree of creative freedom and influence on our data architecture
  • Flexibility through possible part-time models (e.g. 60–80%)
  • Direct collaboration with an interdisciplinary team on equal footing
  • Hybrid work from different locations, such as the Merantix AI Campus in Berlin

Job title

Data Engineer

Experience level

Junior / Mid level

Salary

Not specified

Degree requirement

Bachelor's Degree
