Data Engineer optimizing PostgreSQL and OpenSearch platforms at Deutschlandradio, supporting media research with data analysis and ensuring high data quality across projects.
Responsibilities
Further development and optimization of our on-premise data platform (PostgreSQL, OpenSearch), particularly for the analysis of podcast log files
Maintenance and improvement of ETL/ELT pipelines (Airflow) for processing usage data from various sources
Maintenance of the CI/CD pipeline (GitLab) and deployment processes for automated data processing and dashboard updates
Technical support of our tracking solution (Piano Analytics) and ensuring high data quality for analyses
Provision of reliable data for the media research team and editorial departments, and support with technical questions regarding dashboards and reports
Further development of monitoring and alerting, as well as ensuring GDPR compliance across all data processes
Requirements
Degree in Computer Science, Mathematics, Data Science or a comparable professional background with several years of experience as a Data Engineer
Strong programming skills in Python and solid SQL skills, ideally with practical experience in PostgreSQL
Experience with Airflow for orchestrating ETL/ELT pipelines and familiarity with GitLab CI/CD, Docker and Linux environments
Experience in log file analysis and processing, and ideally knowledge of OpenSearch, Elasticsearch or similar search engines
Basic understanding of Machine Learning and NLP for future projects is desirable, as well as cloud experience (GCP, AWS)
Strong communication skills, a preference for working independently, and the ability to explain technical matters clearly
Good written and spoken German (C1 level or higher)
Benefits
Exciting and creative tasks in a dynamic and motivated team
Work at a public-service broadcaster that follows the guiding principle of acting "confidently, respectfully and curiously" both internally and externally
The chance to contribute your own ideas and suggestions, along with a variety of development and training opportunities
Flexible home office and mobile working options
Family-friendly employer and childcare during the summer holidays
Company pension scheme and other attractive collective-agreement social benefits
Discounted sports offerings through our corporate sports program and a partnership with Urban Sports Club
Good public transport connections and a monthly subsidy for the Deutschlandticket (nationwide public transport pass)
Meal options available at both Cologne and Berlin locations