Senior Data Engineer at Reos responsible for scalable ETL pipelines using Microsoft Fabric. Focused on data integration from various sources and data modeling processes.
Responsibilities
Architect and implement scalable ETL pipelines using Microsoft Fabric.
Ensure efficient data integration from various sources, including MySQL databases, APIs, and others.
Design and maintain the data warehouse schema to support the company’s evolving requirements.
Develop processes for data modeling, ingestion, and production.
Ensure data integrity, availability, and confidentiality across the entire pipeline.
Stay up to date with industry trends and advances in data engineering and recommend the integration of new technologies.
Requirements
A completed Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field forms the basis of your technical expertise.
At least 5 years of professional experience, giving you a deep understanding of developing and implementing modern ETL pipelines.
Strong knowledge of Microsoft Fabric or Microsoft Data Factory, Apache Spark, and notebooks characterizes your technical skill set.
You have extensive experience with MySQL and are familiar with various databases and data warehousing solutions.
A proven track record in batch processing and working with structured data (ideally in the real estate sector) completes your profile.
Enthusiasm for AI and automation, and motivation to further develop skills in Power BI, are also part of your profile.
Fluent German and very good English skills enable you to communicate confidently with all stakeholders.
Benefits
Remote Work: Flexible working model – work up to 100% remotely within Germany or use our modern office in Hamburg as needed.
Workation: A work environment with a holiday feel – enjoy the perfect blend of productive work and a relaxed atmosphere within Europe.
Wellbeing: Corporate health management – an annual health budget and a subsidized Urban Sports Club membership.
Retirement Provision: Attractive employer contributions to company pension plans – secure your future with our support.
Hardware: State-of-the-art equipment – so we can continue to innovate together.
Discounts: Attractive offers – enjoy discounts from over 1,500 providers across sports, mobility, fashion, and travel, and use our Edenred benefits card for additional perks.
Knowledge Base: Targeted and on-the-job – seize the opportunity for diverse, individual development and training options.
Networking: Never without the team – look forward to regular team events.
Senior Data Engineer supporting an AI-enabled financial compliance initiative with data pipelines and ingestion processes. Collaborating with diverse teams in a mission-critical regulated environment.
Data Architect leading the definition and construction of cloud data architecture for Kyndryl. Participating in significant technological modernization initiatives, focusing on Google Cloud Platform.
Senior Data Engineer driving data intelligence requirements and scalable data solutions for a global consulting firm. Collaborating across functions to enhance Microsoft architecture and analytics capabilities.
Experienced AI Engineer designing and building production-grade agentic AI systems using generative AI and large language models. Collaborating with data engineers and data scientists in a tech-driven company.
Intermediate Data Engineer designing and building data pipelines for travel industry data management. Collaborating across teams to ensure reliable data for analytics and reporting.
Data Engineer managing and organizing datasets for AI models at Walaris, developing AI-driven autonomous systems for defense and security applications.
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.