Data Engineer focusing on data pipeline development and analytics for a global tech company. Collaborating with teams to ensure data availability and quality standards.
Responsibilities
Design, develop, and maintain data pipelines (batch and streaming) for ingestion, transformation, and delivery of data for analytics and application consumption.
Build and evolve analytical data models (bronze/silver/gold layers, data marts, star schemas, wide tables), ensuring consistency, documentation, and reusability.
Implement data quality best practices (tests, validations, contracts, SLAs/SLOs, monitoring of freshness/completeness/accuracy) and manage incident resolution with root cause analysis (RCA).
Define and maintain technical data governance: catalog, lineage, versioning, naming conventions, ownership, access policies, and audit trails.
Optimize performance and cost of queries and pipelines (partitioning, clustering, incremental loads, materializations, job tuning).
Support the full delivery lifecycle (discovery → development → validation → operations), aligning business requirements with technical needs and ensuring predictability.
Collaborate with BI/Analytics teams to define metrics, dimensions, facts, and the semantic layer, ensuring traceability of key indicators.
Enable and operationalize AI/ML use cases.
Integrate sources and systems (APIs, databases, queues, events, files), ensuring security, idempotency, fault tolerance, and end-to-end traceability.
Produce and maintain technical and functional documentation relevant for auditing, support, and knowledge transfer.
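Purely as an illustration (the posting itself prescribes no specific stack or library), the freshness and completeness monitoring mentioned in the responsibilities above might be sketched in plain Python; the helper names and the `orders` sample are hypothetical:

```python
from datetime import datetime, timedelta

def check_freshness(last_loaded_at: datetime, max_lag: timedelta) -> bool:
    """Freshness: the most recent load must fall within the agreed SLA window."""
    return datetime.utcnow() - last_loaded_at <= max_lag

def check_completeness(rows, required_fields):
    """Completeness: return the rows missing any required field (empty list = pass)."""
    return [r for r in rows if any(r.get(f) is None for f in required_fields)]

# Hypothetical sample batch: one order is missing its customer_id.
orders = [
    {"order_id": 1, "customer_id": 42, "amount": 10.0},
    {"order_id": 2, "customer_id": None, "amount": 5.0},
]
failing = check_completeness(orders, ("order_id", "customer_id", "amount"))
```

In a real pipeline these checks would typically run as post-load tests, with failures feeding the incident and RCA process described above.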
Requirements
Proven experience as a Data Engineer focused on Analytics (building pipelines, modeling, and making data available for consumption).
Strong command of SQL and solid experience with Python (or an equivalent language) for data engineering and automation.
Experience with orchestration and workflow design (e.g., Airflow, Dagster, Prefect, or similar).
Experience with data warehouses/lakehouses and analytical formats/architectures (e.g., BigQuery, Snowflake, Databricks, Spark; Parquet, Delta, Iceberg).
Hands-on experience with ETL/ELT, incremental loads (CDC when applicable), partitioning, and performance/cost optimization.
Knowledge of data quality and reliability best practices (data testing, observability, metrics, incident management, and RCA).
Experience with version control (Git) and delivery practices (code review, branching patterns, basic CI).
Strong verbal and written communication skills for interacting with technical teams and stakeholders, with the ability to translate requirements into clear deliverables.
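As a sketch only of the incremental-load and idempotency requirements above (the helper names, the `updated_at` change-tracking column, and the sample data are all assumptions, not part of the role), a high-water-mark extract with an idempotent upsert might look like:

```python
def incremental_extract(source_rows, watermark):
    """Pull only rows changed after the stored watermark (high-water-mark pattern)."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

def merge_idempotent(target, new_rows, key="id"):
    """Upsert by key, so re-running the same batch leaves the target unchanged."""
    by_key = {r[key]: r for r in target}
    for r in new_rows:
        by_key[r[key]] = r
    return list(by_key.values())

# Hypothetical source table with an updated_at change-tracking column.
source = [
    {"id": 1, "updated_at": 1},
    {"id": 2, "updated_at": 3},
    {"id": 3, "updated_at": 4},
]
changed, new_wm = incremental_extract(source, watermark=2)
target = merge_idempotent([{"id": 1, "updated_at": 1}], changed)
```

Because the merge is keyed, replaying the same batch is a no-op, which is the idempotency property the requirement refers to.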
Benefits
Health and dental insurance;
Meal and grocery allowance;
Childcare assistance;
Extended parental leave;
Partnerships with gyms and health/wellness professionals via Wellhub (Gympass) and TotalPass;
Profit-sharing program;
Life insurance;
Continuous learning platform (CI&T University);
Employee discount club;
Free online platform dedicated to physical and mental health and wellbeing.