Designing, building, and operating data architecture on AWS for Bring! Labs. Leading data migration efforts and collaborating with product and operations teams.
Responsibilities
Design the target data architecture, establishing modeling patterns and transformation standards
Lead the migration of existing pipelines to dbt, improving and consolidating the current solution
Define, own and document data contracts between source systems and downstream consumers
Partner with product and operations teams to translate business needs into scalable data models
Build for self-service, enabling teams across the company to access and trust the data they need
Requirements
Strong data modeling expertise, translating business requirements into scalable data structures
Experience with modern data stack tooling such as dbt, alongside warehouse platforms like Databricks, Snowflake, or similar
Cloud data warehousing experience at internet scale, preferably on AWS
Data governance and security awareness, including data ownership, access control, and lineage
BI tool experience at the architecture/administration level
Strong SQL skills for complex aggregations and proficiency in Python
Understanding of, experience with, and interest in emerging AI tooling and practices in software engineering
Business fluent in English; German is an advantage
Nice to have:
Java or Scala experience (our current platform uses these)
Familiarity with Data Mesh or Data Fabric concepts
Experience with applying ML concepts in data platforms
Benefits
A young and rapidly evolving company that empowers employees to make decisions and actively shape our success
A modern and attractive working environment in the heart of Berlin (and additional offices in Zurich and Basel) with free barista-grade coffee
Flexible working hours with the option to work from the office, as well as partially from home
Social events that bring the team together, including twice-yearly company-wide get-togethers and regular team events, all covered by us!
A commitment to sustainability, including mostly traveling by public transport and providing a Bahncard 50 for your commute
Many cool perks, such as 25 days of vacation + a day off on your birthday, the latest hardware, home office subsidies, and much more!
Junior Data Engineer role at Allegro, focusing on developing ETL/ELT pipelines and processing large datasets. Collaborate with cross-functional teams for data quality and reporting.
Data Engineer at Concept Reply developing innovative data-driven solutions in IoT. Collaborating with teams to unlock the potential of data and cloud computing.
Data Engineer creating and managing data pipelines for critical data solutions at S&P Global. Collaborating on enterprise-scale data processing in a supportive, innovative environment.
Data Engineer supporting and evolving a data environment through cloud migration. Maintain and optimize existing databases while designing modern data solutions with cross-functional collaboration.
Senior Data Engineer responsible for data pipeline projects at Suprema Gaming. Focus on batch and streaming data solutions while collaborating with business teams.
Senior data leader managing the enterprise data architecture at Breakthru Beverage. Leading high-performing teams in data engineering and defining modern data strategies.
Data Engineer at Equinix implementing data architecture solutions for scalability and analytics. Collaborating with teams to design data pipelines and maintain data models for business objectives.
Data Warehouse Architect developing and optimizing robust data warehouse environments on SAP BW/4HANA. Critical for enabling advanced analytics and reporting across the organization.
Data Engineering Manager leading a new Data Engineering team in Bengaluru. Shaping the design and scaling of core data engineering practices across the organization.
Senior Google Data Architect designing and delivering scalable data solutions on Google Cloud Platform. Collaborating across teams to shape target-state data architectures and influence enterprise data strategy.