Data Engineer developing automated data pipelines for Creditreform. Collaborating with Data Scientists and Analysts while ensuring data quality and consistency.
Responsibilities
Design, implement and operate fully automated data pipelines (ETL, ELT, etc.) for efficient processing of large volumes of structured and unstructured data using Azure Databricks
Connect various internal (e.g. databases, APIs) and external data sources (including APIs, web crawlers)
Ensure technical and operational data quality and consistency through appropriate validation and cleansing processes
Monitor running data pipelines and processes and resolve any operational errors
Work closely with our Data Scientists, Data Analysts, Data Engineers and BI developers
Document work results in a structured, audience-appropriate manner
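To illustrate the validation and cleansing responsibility above, here is a minimal sketch in plain Python (the schema, field names, and rules are hypothetical, not from the posting; in practice this logic would run on Azure Databricks with Spark DataFrames):

```python
# Minimal sketch of a validation/cleansing step (hypothetical schema).
# Plain Python dicts are used to keep the example self-contained;
# the same rules would translate to Spark DataFrame filters on Databricks.

def clean_records(records):
    """Drop records missing a company_id and normalize whitespace in names."""
    cleaned = []
    for rec in records:
        if not rec.get("company_id"):
            continue  # reject records failing the mandatory-field check
        rec = dict(rec)  # copy so the input list is left untouched
        rec["name"] = " ".join(rec.get("name", "").split())
        cleaned.append(rec)
    return cleaned

raw = [
    {"company_id": "C1", "name": "  Acme   GmbH "},
    {"company_id": None, "name": "Broken Row"},
]
print(clean_records(raw))  # only the Acme record survives, name normalized
```

In a real pipeline, rejected records would typically be routed to a quarantine table for review rather than silently dropped, so that data-quality issues remain auditable.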
Requirements
Degree in Computer Science, Data Engineering, a natural science or a comparable field
Initial professional experience in Data Engineering
Knowledge of developing and optimizing data pipelines (ETL, ELT), ideally with Spark and Databricks
Experience with Python, SQL, Spark, etc., with a clear focus on Data Engineering
Solid basic understanding of connecting databases, APIs, etc. to a data platform
Implementation of, and adherence to, high data protection and security standards in a regulated environment
Drive to continuously learn and proactively develop professionally
Very good German and good English skills, both written and spoken
Benefits
30 days of vacation + Rosenmontag (Carnival Monday) off
Flexible working hours and the option to work remotely up to 2 days per week
Internal complimentary courses and ongoing training and development
Occupational health promotion, sports and recreation programs (location Neuss)
Support for company pension scheme
Family-friendly policies, private supplementary insurance also available for dependents
Opportunity to study while working
Flat-hierarchy communication and regular feedback discussions
After-work events as well as summer and Christmas parties (location Neuss)
Employee offers via Corporate Benefits or other employee discounts for items such as electronics, travel or housing
Fully equipped kitchen with coffee, tea, water and fresh fruit
Subsidized delivered lunch and a snack vending machine offering sweet, healthy or hot snacks (location Neuss)
Covered free parking and lockable bicycle parking (location Neuss)
Good public transport connections
Option for a JobRad (bike leasing) and a discounted Deutschlandticket
Associate Data Engineer undergoing training in data engineering tools and collaborating with senior engineers on projects. Hands-on experience with real-world datasets and cloud-based platforms.
Data Engineer developing data infrastructures and pipelines for Motive's Enterprise clients. Collaborating across teams to enhance data processing with modern tools and frameworks.
Senior Data Engineer writing production code and maintaining data pipelines for commercial banking insights. Collaborating with other teams to deliver powerful data solutions.
Lead Data Engineer responsible for designing and implementing data solutions. Work with AWS and Snowflake in a global data and AI company headquartered in New York.
Senior Data Engineer developing ingestion, ETL & ELT processes with Azure Synapse at Indra Group. Supporting data solutions and compliance in a collaborative hybrid work environment.
Senior Data Engineer building data pipelines for business intelligence at Getsafe. Involves designing and maintaining a Data Warehouse platform using tools like SQL and Python in a hybrid work environment.
Big Data Architect involved in the Data Engineering Team, drafting architectures and sizing efforts. Collaborating with key team members to ensure data platform efficiency and best practices.
Data Engineer building and maintaining core data infrastructure for Aircall's AI customer communications platform. Support data reliability and scalability for high-impact use cases.
Senior Data Engineer shaping data landscape and leading ETL/ELT solutions for mission-critical engineering projects in the UK. Collaborating with teams to deliver high-quality data products.