Data Warehouse Developer responsible for developing and maintaining an open-source Data Lakehouse at Uni Systems. The role involves designing data pipelines and ensuring data quality across multiple sources.
Responsibilities
Development and maintenance of a fully open-source Data Lakehouse.
Design and development of data pipelines for scalable and reliable data workflows to transform large volumes of both structured and unstructured data.
Data integration from various sources, including databases, APIs, data streaming services and cloud data platforms.
Optimisation of queries and workflows for increased performance and enhanced efficiency.
Writing modular, testable and production-grade code.
Ensuring data quality through monitoring, validation and data quality checks, maintaining accuracy and consistency across the data platform.
Development of test programs.
Comprehensive documentation of processes to ensure seamless data pipeline management and troubleshooting.
Assistance with deployment and configuration of the system.
Participation in meetings with other project teams.
Requirements
Bachelor's degree in IT or a related field and 13 years of professional experience in IT.
Excellent knowledge of data warehouse and/or data lakehouse design & architecture.
Excellent knowledge of open-source, code-based data transformation tools such as dbt, Spark and Trino.
Excellent knowledge of SQL.
Good knowledge of Python.
Good knowledge of open-source orchestration tools such as Airflow, Dagster or Luigi.
Experience with AI-powered assistants like Amazon Q that can streamline data engineering processes.
Good knowledge of relational database systems.
Good knowledge of event streaming platforms and message brokers like Kafka and RabbitMQ.
Extensive experience in creating end-to-end data pipelines using the ELT framework.
Understanding of the principles behind open table formats such as Apache Iceberg or Delta Lake.
Proficiency with Kubernetes and Docker/Podman.
Good knowledge of data modelling tools.
Good knowledge of online analytical data processing (OLAP) and data mining tools.
Fluent in English, at least at level C1.
Benefits
At Uni Systems, we provide equal employment opportunities and prohibit any form of discrimination on the grounds of gender, religion, race, colour, nationality, disability, social class, political beliefs, age, marital status, sexual orientation or any other characteristic.
Take a look at our Diversity, Equality & Inclusion Policy for more information.