Data Engineer building scalable data platforms using Databricks and Azure for Reconomy. Collaborating across teams to deliver high-quality data solutions.
Responsibilities
Design and build scalable, reliable data platform solutions using Databricks, Spark, and Azure Cloud technologies.
Design, implement and optimize data pipelines, ensuring high performance, data quality, and alignment with business requirements.
Ensure the data platform enables easy access to high-quality, secure, and compliant data for all stakeholders, fostering a self-service analytics environment.
Work closely with other squads, data scientists, analysts, and product teams to understand requirements and deliver data solutions that support their needs.
Partner with engineers across teams to ensure consistency in data models, pipelines, and platform usage.
Communicate technical decisions and trade-offs clearly to both technical and non-technical stakeholders.
Contribute to the design of scalable, secure data architectures that support the company's data products and services.
Champion and implement best practices for data governance, security, and compliance within the codebase and pipelines.
Stay current with industry trends and emerging technologies, and bring new ideas to improve the team's data engineering capabilities.
Requirements
Proven hands-on experience with **Spark, either PySpark or Spark with Scala**
Solid understanding of **Delta Lake** architecture and best practices
Expertise in building and scaling data platforms, with a strong focus on **Databricks** and **Cloud** technologies, preferably Azure Cloud
Working experience with Python
SQL knowledge
Proven experience contributing to the design and implementation of data platform solutions and delivering large-scale data infrastructure projects using Databricks and Azure
Experience with NoSQL database architecture and data modeling (e.g. Azure Cosmos DB)
Solid understanding of data architecture, data modeling, and distributed systems
Benefits
This role offers you the chance to work in a friendly, diverse, and international environment, alongside colleagues who share your passion for innovation, agile working, and growth. You will also be able to develop your skills within the exciting and challenging market of Reverse Logistics!
Hybrid working environment
Training and development to keep you in touch with the latest technologies, and opportunities to apply your learning.
We offer a competitive salary alongside other benefits*
Our office is easily accessible, located near the city center of Bucharest, and designed to make you feel at home
21 working days of annual leave, plus 2 additional days allowed for participation in volunteering programs, and 1 extra day off for your birthday
Meal vouchers: 40 RON per working day (taxed according to current legislation)