Senior Data Engineer responsible for designing, building, and deploying data solutions at CI&T. Collaborating on technical engagements and ensuring adherence to security best practices.
Responsibilities
Contribute to technical engagements and proactively identify opportunities to expand CI&T's business with clients.
Collaborate on multiple projects across various domains, providing diverse subject matter expertise.
Participate in technical design and architecture discussions to ensure robust solutions.
Ensure adherence to security and performance best practices in all solutions.
Create and maintain technical documentation required by clients.
Communicate effectively with customer team members to ensure alignment and clarity of technical solutions.
Understand client requirements and develop viable technical solutions by selecting appropriate frameworks.
Guarantee a stable and productive development environment for all team members.
Requirements
6+ years of experience in software development.
Proven expertise in Data Warehousing and Data Analytics projects, encompassing data acquisition, transformation, and data science initiatives.
Exposure to modern data file formats such as Delta Lake, Apache Iceberg, and Parquet.
Proficiency in Python, Spark, DBT, and other data transformation tools and frameworks.
Expertise in data modelling, including conceptual, logical, and physical data models, following methodologies such as Data Vault 2.0 and Dimensional Modelling.
Experience with cloud data services such as Microsoft Intelligent Data Platform, AWS Data Services, or Google Cloud Data Analytics.
Minimum of 3 years' experience with Snowflake.
Hands-on experience with cloud data platforms such as Databricks and/or Snowflake; relevant certifications are a plus.
Skilled in data pipeline orchestration tools, including Azure Data Factory, AWS Glue, Apache Airflow, or Prefect.
Experience with CI/CD and MLOps pipelines to streamline development and deployment processes.
Proficient in Infrastructure as Code (IaC) for efficient infrastructure management.
Expertise in data migration, performance tuning, and optimization of databases and SQL queries.
Understanding of Data Governance and Data Management concepts.
Understanding of design patterns, clean architecture, and clean coding principles.
Familiarity with unit, integration, and E2E testing.