Senior Data Engineer at Reaktor working on data-intensive applications and scalable architectures. Collaborating with clients to develop data pipelines and solutions for various applications.
Responsibilities
Designing and developing scalable data pipelines (ETL/ELT) in both batch and stream processing settings.
Working on infrastructure, integrations and APIs for storing, transforming and delivering data to end users.
Working together with our clients and domain experts to figure out their data needs.
Collaborating with data scientists, machine learning engineers and designers to design end-to-end data products.
Building, for example: an end-to-end data pipeline for gathering impression data and serving content recommendations for a media streaming platform; a semantic search engine for large amounts of textual data; a data lake with numerous sources of financial data; or a platform and data pipelines for running ML models and integrating results into different applications.
Requirements
Experience in hands-on data engineering projects.
A background in working with databases and SQL (such as PostgreSQL, DynamoDB, Redshift, BigQuery).
Experience in data platforms, data lakes or data warehouses (such as Databricks, Snowflake).
A solid programming foundation in Python (required), with additional experience in other languages such as Java or TypeScript being a plus. Proficiency in data-related libraries like PySpark and Pandas.
Experience in at least one cloud provider (AWS, Azure, GCP).
Understanding of infrastructure-as-code practices (such as Terraform, CloudFormation, CDK).
Consulting skills, i.e., good communication skills and the ability to work with clients to figure out their data needs.
Fluency in English, both written and spoken. Finnish is seen as an advantage.
General knowledge of modern data science and business intelligence tools and frameworks is seen as a plus.
Experience in data architectures and data modelling is a plus.
Benefits
The ability to impact how you work. Together with the client, your team chooses the approach, technologies, and methodologies you think will work best in any given situation
A community with as much support as your heart desires
A team that’s not only experienced but considerate as well – they all want you to succeed
A sustainable work-life balance and support for your daily life outside of work. (e.g., free moving day, Reaktor car share, sick child care services, office space to use for your private events, etc.)
An opportunity to grow as a professional. In addition to the day-to-day work, we offer internal training courses, community events, and 15-minute coffee breaks to discuss hot topics in tech and design
A possibility to take part in more extended academy-like studies like Cloud Academy
300+ hobby clubs, from winter swimming and running to knitting and archery, that bring people together outside of (and sometimes inside) office hours. Many of these are supported by Reaktor
Product Owner driving ERP data migration initiatives for BioNTech’s global landscape. Leading effective data management and ensuring compliance with regulatory standards in a fast-paced environment.
Data Engineer II leading development and delivery of data pipelines for Syneos Health. Collaborating with teams to optimize data processing and integrate solutions into production environments.
Lead Data Engineer overseeing data operations and analytics engineering teams for OneOncology. Focused on operational excellence in data platform and model reliability for cancer care improvement.
Senior AWS Software Data Engineer at Boeing focusing on AWS Data services to support digital analytics capabilities. Collaborating with cross-functional teams to design, develop, and maintain software data solutions.
Senior Data Engineer designing and improving software for business capabilities at Barclays. Collaborating with teams to build a data and intelligence platform for Equity Derivatives.
Senior AI & Data Engineer developing and implementing AI solutions in collaboration with clients and teams. Working on projects involving generative AI, predictive analytics, and data mastery.
Consultant driving IA business growth in Deloitte's Artificial Intelligence & Data team. Delivering innovative solutions using data analytics and automation technologies.
Data Engineer responsible for managing data architecture and pipelines at Snappi, a neobank. Collaborating with teams to enable data processing and analysis in innovative banking solutions.
Data Engineer at Destinus developing the data platform to support production and analytics needs. The hybrid role involves migrating Excel sources to a Lakehouse and integrating ERP systems.
Senior Data Engineer developing solutions within the Global Specialty portfolio at an insurance company. Engaging with diverse business partners to ensure high quality data reporting.