Expert Data Engineering L3 Support Engineer leading Data Warehouse development and support processes while managing the VOIS development team. Conducting root cause analyses and implementing sustainable solutions.
Responsibilities
As an Expert Data Engineering L3 Support Engineer (m/f/d), you independently implement and lead key parts of the data warehouse development and support processes.
You manage the VOIS development team across various data warehouse platforms.
You perform root cause analyses, develop hotfixes, and incorporate the findings into the design of sustainable solutions.
You oversee development and operations tasks such as incident and problem management within a DevSecOps team.
Requirements
Successfully completed degree in Computer Science, Business Informatics, Data Science, Mathematics, Physics, or a comparable qualification.
4 years of practical experience in data warehouse development (SQL, Python, Ab Initio, Unix).
Certification as Google Cloud Professional Data Engineer (or an equivalent AWS certification).
Strong conceptual knowledge of different database types and ETL tools (BigQuery, Teradata, Ab Initio, dbt, Dataform).
Familiarity with telecommunications-specific data such as BSS (contracts, customers, products, services) and OSS (network data); excellent data literacy for fault/root-cause analysis.
Good written and spoken German and English (B2 CEFR).
Benefits
Flexibility: Hybrid working enables you to work from the office or up to 3 days per week from home.
Additionally, you can work for up to 20 days per year from other EU countries and in an increasing number of non-EU countries.
Attractive compensation and retirement benefits: As a collective-agreement employee, you receive either holiday and Christmas bonuses or a 13th salary depending on the pay model.
Company pension: We also offer a company pension plan as part of our retirement provision.
Professional development: You choose which of our learning and training programs support your individual development.
Work–life balance: From childcare and health & mindfulness programs to gym memberships — you can flexibly shape your work and personal life. We support you, including when caring for relatives.
Discounts and additional perks: Employees receive special offers on all our mobile, landline, internet, and TV products.
Senior Data Engineer supporting an AI-enabled financial compliance initiative with data pipelines and ingestion processes. Collaborating with diverse teams in a mission-critical regulated environment.
Data Architect leading the definition and construction of cloud data architecture for Kyndryl. Participating in significant technological modernization initiatives, focusing on Google Cloud Platform.
Senior Data Engineer driving data intelligence requirements and scalable data solutions for a global consulting firm. Collaborating across functions to enhance Microsoft architecture and analytics capabilities.
Experienced AI Engineer designing and building production-grade agentic AI systems using generative AI and large language models. Collaborating with data engineers and data scientists in a tech-driven company.
Intermediate Data Engineer designing and building data pipelines for travel industry data management. Collaborating across teams to ensure reliable data for analytics and reporting.
Data Engineer managing and organizing datasets for AI models at Walaris, developing AI-driven autonomous systems for defense and security applications.
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.