Senior Data Engineer at Aroundhome working on a data platform for empowering house owners. Collaborating with stakeholders to develop scalable data solutions and AI-integrated products.
With a mission to empower 15.6 million house owners across Germany and the broader DACH region, we at Aroundhome are building a platform where every strategic decision is powered by data.
Responsibilities
Design necessary data models and transformations to curate raw data.
Develop, optimize, and maintain existing data models, pipelines, and transformations to support analytics, reporting, and AI use cases.
Architect and contribute to implementing a scalable, modern data platform, including a data lakehouse or warehouse, to support real-time data flows.
Build and optimize real-time data pipelines using Kafka, Spark, and Delta Live Tables.
Support the team lead in establishing and enforcing data governance frameworks.
Collaborate with the team to integrate AI/ML capabilities, operationalize ML models, and embed AI in business processes.
Mentor the team on best practices, modern tools, and cloud-native scalability.
Requirements
Master's degree in Computer Science, Data Engineering, or a related field (or equivalent experience).
10+ years of experience in data engineering, with 5+ years in senior roles focused on modern architectures.
Excellent communication and collaboration skills, the ability to drive change and influence stakeholders, and a passion for mentoring, coaching, and sharing knowledge.
Proven expertise in designing, developing, and maintaining data lakehouses/warehouses (e.g., Databricks Delta Lake, Snowflake) and transformations (e.g., dbt, SQL/Python/Spark).
Strong experience with cloud platforms such as AWS (S3, Athena, MSK/Kafka), infrastructure-as-code with Terraform, and real-time streaming (e.g., Kafka, Spark Structured Streaming, Flink).
Hands-on knowledge of data governance tools (e.g., Unity Catalog, Collibra) for lineage, quality, catalogs, and a single source of truth (SSOT).
Familiarity with AI/ML pipelines and MLOps (e.g., MLflow, feature stores) and with complex system integration across modern data technologies.
Proficiency in CI/CD for data and with tools like Git, Airflow, or dbt Cloud.
Experience with large-scale data modeling (Data Vault, dimensional, schema-on-read) and optimizing for self-service analytics.
Benefits
Hybrid Work // Stay flexible: Decide freely whether you want to work from our office in Berlin, from home throughout Germany, or fully remote from Portugal. Additionally, we offer you the possibility to work up to 30 days per year from selected EU countries.
Room to grow // Stay curious: In regular/annual feedback meetings, you design your own personal development path together with your team lead. In addition, we offer monthly Tech Talks, an annual training budget, free access to LinkedIn Learning, as well as great workshops to seek new input and specific training for your personal development.
Mental & physical health // Stay healthy & active: We care about your well-being and support your health with a discounted membership at Urban Sports Club. In addition, our cooperation partner “Fürstenberg Institute” is available to you and your family members with advice and support when it comes to mental health & coaching.
Taking responsibility // We #care: We care for our planet and the people who are living on it. To make our world a little bit better, our ESG Team (Environmental, Social & Governance) promotes our awareness for environmentally friendly behavior. In addition, we work with JobRad, offer a discounted BVG ticket, and actively support social projects in Berlin at our annual Social Day to give something back.
Teamspirit // Stay connected: What drives us is our sense of community. That's why we celebrate our Company Day once a month in our office at Potsdamer Platz, where everyone gets together in person. In addition to a comprehensive update from our C-Board, the main focus is on togetherness and the exchange of ideas over lunch and dinner together.