Big Data Engineer building scalable data ingestion platforms at Allegro. Work with advanced data science and AI applications in a dynamic and collaborative environment.
Responsibilities
Build a highly scalable and fault-tolerant data ingestion platform for millions of Allegro customers
Process 5 billion clickstream events every day from all Allegro sites and mobile applications
Engage in projects based on practical applications of data science and AI
Collaborate with experienced engineers organized into specialized teams
Requirements
Programming in languages such as Scala, Java, or Python
Strong understanding of distributed systems, data storage, and processing frameworks such as dbt, Spark, or Apache Beam
Knowledge of GCP (especially Dataflow and Composer) or other public cloud environments like Azure or AWS
Use good practices such as clean code, code review, TDD, CI/CD
Navigate efficiently within Unix/Linux systems
Positive attitude and team-working skills
Eagerness for personal development and keeping your knowledge up to date
English at B2 level
Benefits
Flexible working hours in an office-first model
Annual bonus based on your performance assessment and the company's results
Well-located offices with fully equipped kitchens and bicycle parking facilities
Excellent working tools including height-adjustable desks and interactive conference rooms
A wide selection of varied benefits in a cafeteria plan
Paid English classes tailored to the specific nature of your job
MacBook Pro/Air or Dell with Windows, depending on your preference
High degree of autonomy in terms of organizing your team’s work
Team tourism, training budget, and an internal educational platform
Data Management professional at Kyndryl involved in creating innovative data solutions and ensuring the seamless operation of complex data systems. Collaborating with teams to transform requirements into scalable database solutions.
Software Engineer designing and developing scalable data processing applications on cloud infrastructure for Thomson Reuters. Collaborating with Data Analysts on AI-enabled solutions for data management and insight generation.
Manager of Data Platform overseeing AWS cloud infrastructure and Snowflake data warehouses for Thomson Reuters. Leading the design and implementation of data processing applications in a hybrid role located in Bengaluru.
Senior Data Engineer designing scalable data pipelines and solutions for Enterprise Data Lake at Thomson Reuters. Collaborating across teams to ensure efficient data ingestion and accessibility.
Senior Data Engineer at Technis developing scalable data pipelines and solutions for innovative connected spaces products. Collaborating within a cross-functional team to deliver high-quality, data-driven outcomes.
Data Architect designing and implementing data architectures supporting analytics and ML for federal clients. Collaborating with teams to translate mission needs into robust data solutions.
IT Data Engineer developing data pipelines and integrations for Scanfil Group's global IT organization. Collaborating across teams to enhance data solutions and reporting capabilities.
Data Engineer developing Azure data solutions at PwC New Zealand. Responsibilities include data quality monitoring, pipeline development, and collaboration with stakeholders in a supportive environment.
Senior Data Engineer designing and implementing the Enterprise Data Platform at Stellix. Focusing on analytics and insights with a growth path to Principal Data Engineer or Data Architect.