Data Engineer designing and implementing data solutions for DATAIS consultancy in a young and friendly environment. Focusing on modern data products using advanced technologies and frameworks.
Responsibilities
Design and implementation of solutions processing large and unstructured datasets (Data Lake Architecture, Streaming Architecture)
Design, build and maintain big data architectures and data pipelines
Develop and implement CI/CD pipeline automation solutions
Implementation, optimization and testing of modern DWH/Big Data solutions on the Azure, GCP or AWS cloud platforms, within a Continuous Integration / Continuous Delivery environment
Data processing efficiency improvement, migrations from on-prem to public cloud platforms
Build relationships with client stakeholders to establish a high level of rapport and confidence
Work with clients and teams to deliver modern data products
Requirements
3+ years of experience in the Data Engineering field
Degree in Engineering, Computer Science, or other scientific fields
Data Enthusiast: Driven by a passion for data and technology.
ETL Maestro: Skilled in orchestrating ETL processes with precision and finesse.
Spark Wizard: Adept at conjuring data cleaning and transformation magic in a Spark environment using Scala or Python.
SQL Sage: Master of SQL and the Apache Spark realm.
OLAP Explorer: Navigates OLAP architectures with ease.
Cloud Commander: Technically savvy in architecting and managing cloud-based solutions.
DevOps and Kubernetes Explorer: Loves working collaboratively and follows Git and DevOps best practices.
Data Alchemist: Comfortable with schema-on-read databases like Redis.
Meticulous Organizer: Possesses an eye for detail and organizational finesse.
Independent Trailblazer: Thrives when working proactively and autonomously.
Problem-Solving Maestro: Solves challenges through information gathering, thoughtful evaluation, and ingenious solutions.
Great Communicator: Strong communication skills, both verbal and written.
Pressure Player: Thrives under pressure, even within complex organizational landscapes.
English Pro: Proficiency in English is non-negotiable.
Benefits
Competitive salary
Semi-annual bonus on target achievement
Flexible Benefits
Remote or Hybrid work
Flexible working hours
Wellness days
Friendly and challenging working environment
Lifelong learning mindset with budget for certifications, training and personal development
Be a protagonist of disruptive tech events in collaboration with several tech communities, scale-ups, universities and bootcamps
Data Engineer at CBTW handling data pipelines and ETL processes using SAS. Collaborating with business stakeholders and ensuring data governance within SAS environments.
Data Engineer I at Catalyst Brands developing and optimizing data pipelines for cross-functional teams. Designing next-generation data platform architecture to meet increasing data demands in a retail environment.
Data Engineer at Grupo Iter responsible for data pipelines and architecture in Azure. Collaborating on data governance and integrating analytics with Power BI.
Full Stack Data Architect for Concurrency designing Azure data-intensive applications. Leading complex data architecture initiatives and mentoring engineering teams in a high-performance environment.
AHEAD builds digital business platforms; seeking a Data Engineer in a development program. Join us to grow into a technical leader emphasizing skills across various practices.
Data Engineer creating clean, reliable data pipelines for Plenti, a fintech lender. Collaborating with modern tools like AWS and Databricks to enhance data quality and analytics.
Data Platform Specialist overseeing data quality and platform operations at Stackgini. Collaborating with teams to enhance data management solutions and improve system performance.
Staff Data Engineer at PPRO transforming the data ecosystem into a self-service platform. Leading technical vision for data engineering and building scalable infrastructures.
SSIS Data Engineer at iKnowHow Group focusing on data migration projects. Involves data modeling, integration, and using T-SQL/SQL alongside SSIS packages.