Working Student assisting in database development and data pipelines at ruhrdot GmbH. Engaging in team collaboration and learning agile methodologies.
Responsibilities
You will assist in gathering and analyzing customer requirements
You will help design and develop database structures, models, and data pipelines
You will work on automating and maintaining workflows to ensure our data and processes remain stable and error-free
You will assist in connecting and cleaning various data sources such as APIs, flat files, and databases
You will be actively involved in collaborating with team members and will become familiar with agile project methods such as prototyping, MVPs, and sprints
Requirements
You are currently enrolled in a STEM or business-related degree program or have a comparable qualification
You have already gained some practical experience (e.g., through internships or projects) in data engineering or related areas
You have a solid understanding of databases, data modeling, and processes such as ETL/ELT and workflow automation
You are proficient in at least one programming language (e.g., Python)
Bonus: You have an interest in Big Data technologies (Spark, Hadoop), Databricks, or cloud services (AWS, GCP, Azure), as well as in data visualization
Initial experience with agile methods (e.g., Scrum or Kanban) is an advantage
You are analytical, structured, reliable, and have a strong willingness to learn
Benefits
Varied tasks in a dynamic, international environment
Exciting clients: from startups to DAX-listed corporations
Scope to shape your work: your ideas and solutions are valued
A corporate culture that promotes collaboration and personal responsibility
Work with the latest technologies and innovative tools
Secure job in a future-oriented industry with attractive compensation
Modern office with high-quality equipment for a comfortable workday and additional top-tier hardware such as premium laptops and smartphones (depending on position)
Flexible work-life balance through home office options
Attractive benefits such as training opportunities, team events, discounts, free coffee & soft drinks onsite, and much more...
Data Engineer II leading development and delivery of data pipelines for Syneos Health. Collaborating with teams to optimize data processing and integrate solutions into production environments.
Lead Data Engineer overseeing data operations and analytics engineering teams for OneOncology. Focused on operational excellence in data platform and model reliability for cancer care improvement.
Senior AWS Software Data Engineer at Boeing focusing on AWS Data services to support digital analytics capabilities. Collaborating with cross-functional teams to design, develop, and maintain software data solutions.
Senior Data Engineer designing and improving software for business capabilities at Barclays. Collaborating with teams to build a data and intelligence platform for Equity Derivatives.
Senior AI & Data Engineer developing and implementing AI solutions in collaboration with clients and teams. Working on projects involving generative AI, predictive analytics, and data mastery.
Consultant driving AI business growth in Deloitte's Artificial Intelligence & Data team. Delivering innovative solutions using data analytics and automation technologies.
Data Engineer responsible for managing data architecture and pipelines at Snappi, a neobank. Collaborating with teams to enable data processing and analysis in innovative banking solutions.
Data Engineer at Destinus developing the data platform to support production and analytics needs. Involves migrating Excel sources to Lakehouse and integrating ERP systems in a hybrid role.
Senior Data Engineer developing solutions within the Global Specialty portfolio at an insurance company. Engaging with diverse business partners to ensure high-quality data reporting.
Data Engineer at UBDS Group focusing on designing and optimizing modern data platforms. Collaborating in a multidisciplinary team to develop reliable data assets for analytics and operational use cases.