Designing, building, and operating data architecture on AWS for Bring! Labs. Leading data migration efforts and collaborating with product and operations teams.
Responsibilities
Design the target data architecture, establishing modeling patterns and transformation standards
Lead the migration of existing pipelines to dbt, improving and consolidating the current solution
Define, own, and document data contracts between source systems and downstream consumers
Partner with product and operations teams to translate business needs into scalable data models
Build for self-service, enabling teams across the company to access and trust the data they need
Requirements
Strong data modeling expertise, translating business requirements into scalable data structures
Experience with modern data stack tools such as dbt, alongside platforms like Databricks, Snowflake, or similar
Cloud data warehousing experience at internet scale, preferably on AWS
Data governance and security awareness: ownership, access control, and lineage
BI tool experience at the architecture/administration level
Strong SQL skills for complex aggregations and proficiency in Python
Understanding of, experience with, and interest in emerging AI tooling and practices in software engineering
Business fluent in English; German is an advantage
Nice to have:
Java or Scala experience (our current platform uses these)
Familiarity with Data Mesh or Data Fabric concepts
Experience with applying ML concepts in data platforms
Benefits
A young and rapidly evolving company that empowers employees to make decisions and actively shape our success
A modern and attractive working environment in the heart of Berlin (and additional offices in Zurich and Basel) with free barista-grade coffee
Flexible working hours with the option to work from the office, as well as partially from home
Social events that bring the team together, including twice-yearly company-wide get-togethers and regular team events, all covered by us!
A commitment to sustainability, including traveling mostly by public transport and a Bahncard 50 for your commute
Many cool perks, such as 25 days of vacation + a day off on your birthday, the latest hardware, home office subsidies, and much more!
Data Engineer II leading development and delivery of data pipelines for Syneos Health. Collaborating with teams to optimize data processing and integrate solutions into production environments.
Lead Data Engineer overseeing data operations and analytics engineering teams for OneOncology. Focused on operational excellence in data platform and model reliability for cancer care improvement.
Senior AWS Software Data Engineer at Boeing focusing on AWS Data services to support digital analytics capabilities. Collaborating with cross-functional teams to design, develop, and maintain software data solutions.
Senior Data Engineer designing and improving software for business capabilities at Barclays. Collaborating with teams to build a data and intelligence platform for Equity Derivatives.
Senior AI & Data Engineer developing and implementing AI solutions in collaboration with clients and teams. Working on projects involving generative AI, predictive analytics, and data mastery.
Consultant driving AI business growth in Deloitte's Artificial Intelligence & Data team. Delivering innovative solutions using data analytics and automation technologies.
Data Engineer responsible for managing data architecture and pipelines at Snappi, a neobank. Collaborating with teams to enable data processing and analysis in innovative banking solutions.
Data Engineer at Destinus developing the data platform to support production and analytics needs. Involves migrating Excel sources to Lakehouse and integrating ERP systems in a hybrid role.
Senior Data Engineer developing solutions within the Global Specialty portfolio at an insurance company. Engaging with diverse business partners to ensure high quality data reporting.
Data Engineer at UBDS Group focusing on designing and optimizing modern data platforms. Collaborating in a multidisciplinary team to develop reliable data assets for analytics and operational use cases.