Senior Data Engineer responsible for designing, building, and deploying data solutions at CI&T. Collaborating on technical engagements and ensuring adherence to security best practices.
Responsibilities
Contribute to technical engagements and proactively identify opportunities to expand CI&T's business with clients.
Collaborate on multiple projects across various domains, providing diverse subject matter expertise.
Participate in technical design and architecture discussions to ensure robust solutions.
Ensure adherence to security and performance best practices in all solutions.
Create and maintain technical documentation required by clients.
Communicate effectively with customer team members to ensure alignment and clarity of technical solutions.
Understand client requirements and develop viable technical solutions by selecting appropriate frameworks.
Guarantee a stable and productive development environment for all team members.
Requirements
6+ years of experience in software development.
Proven expertise in Data Warehousing and Data Analytics projects, encompassing data acquisition, transformation, and data science initiatives.
Exposure to modern data file formats such as Delta Tables, Apache Iceberg, and Parquet.
Proficiency in Python, Spark, DBT, and other data transformation tools and frameworks.
Expertise in data modelling, with familiarity in conceptual, logical, and physical data models, following methodologies like Data Vault 2.0 and Dimensional Modelling.
Experience with cloud data services such as Microsoft Intelligent Data Platform, AWS Data Services, or Google Cloud Data Analytics.
Minimum of 3 years' experience with Snowflake.
Hands-on experience with cloud data platforms such as Databricks and/or Snowflake; relevant certifications are a plus.
Skilled in data pipeline orchestration tools, including Azure Data Factory, AWS Glue, Apache Airflow, or Prefect.
Experience with CI/CD and MLOps pipelines to streamline development and deployment processes.
Proficient in Infrastructure as Code (IaC) for efficient infrastructure management.
Expertise in data migration, performance tuning, and optimization of databases and SQL queries.
Understanding of Data Governance and Data Management concepts.
Understanding of design patterns, clean architecture, and clean coding principles.
Familiarity with unit, integration, and E2E testing.
Data Engineer/Analyst maintaining and improving data infrastructure for Braiins. Collaborating with technical and business teams to ensure reliable data flows and insights.
Medior Data Engineer handling Azure migrations for a major urban mobility client. Focused on data pipeline development and ensuring platform reliability with cutting-edge technologies.
Developing ML and computer vision solutions for a cutting-edge autonomous vehicle dataset pipeline at Mobileye. Collaborating across teams on data curation and advanced perception algorithms.
Data Migration Lead in a hybrid role managing data migration for a major transformation programme in the media sector. Collaborating with various teams to ensure data integrity and successful migration.
Consultant ML & DataOps at Smile integrating data science projects for major clients. Designing MLOps solutions and enhancing data governance in a collaborative environment.
Data Engineer developing and maintaining data pipelines for Coolbet’s analytical services. Working within an Agile framework to ensure data reliability and efficiency.
API Data Engineer developing innovative data-driven solutions and advancing data architecture for AI Control Tower. Building and integrating APIs and data pipelines to support organizational needs.
Journeyman Data Architect supporting Leidos' enterprise data and analytics program for the Department of War. Collaborating on solutions for data architecture, cloud environments, and governance.
Senior Software Engineer developing backend services and data infrastructure for integrated products at Booz Allen. Collaborating with a small elite team to deliver reliable and scalable services.
AWS Streaming Data Engineer developing software and systems in a fast, agile environment. Utilizing experience with real-time data ingestion and processing systems across distributed environments.