Data Engineer role at Haeger Consulting focused on developing data solutions and processing data for a variety of clients. Responsible for building scalable ETL/ELT data pipelines and ensuring data quality.
Responsibilities
Develop and implement data solutions for a variety of clients
Build scalable ETL/ELT data pipelines for batch and streaming processing
Implement data models following Kimball, Data Vault, or Lakehouse principles
Transform, normalize, and denormalize data from various source systems
Orchestrate data pipelines using Airflow or similar tools
Support data analytics and machine learning pipelines
Ensure quality through testing, documentation, and CI/CD
Requirements
Several years of professional experience in the areas listed above
Strong SQL skills
Strong knowledge of Spark / PySpark
Proficient in Python
Experience with dbt and Airflow or comparable orchestration tools
Experienced with Snowflake and/or Databricks
Experienced in data modeling (dimensional modeling, Data Vault, normalization & denormalization)
Analytical mindset and enjoyment of solving complex data problems
Self-organized working style and the ability to maintain oversight across multiple client projects
Interest in consulting and varied, engaging projects
Open and constructive communication and an appreciation for candid feedback
Benefits
Innovative projects and a creative environment: Work on exciting client engagements in a creative and dynamic setting with plenty of room for your own ideas and personal development.
Agility: Join a team that works in an agile, flexible way to consistently deliver the best results.
Flexibility: Shape your workday according to your needs — with options for remote work and flexible hours.
Scope for influence: You have the opportunity to actively shape and expand our data practice.
Knowledge sharing & growth: Benefit from intensive internal know-how transfer and the opportunity to actively contribute to the team's development — both professionally and personally.
Team spirit: In our open and collegial environment, teamwork is paramount — we believe the best ideas emerge through collaboration.
We value diversity and promote a culture of inclusion. Everyone is welcome!
Attractive perks: You will receive a competitive salary, various benefits, and fun employee events.