Senior Technical Lead/Architect focusing on designing and implementing AWS data ingestion pipelines. Collaborating with teams to enhance operational SLAs and data quality workflows.
Responsibilities
Lead technical solution discovery for new capabilities and functionality.
Assist the Product Owner with technical user stories to maintain a healthy feature backlog.
Lead the development of real-time data pipelines using AWS DMS, MSK/Kafka, or Glue Streaming for CDC ingestion from multiple SQL Server sources (RDS and on-premises).
Build and optimize streaming and batch data pipelines using AWS Glue (PySpark) to validate, transform, and normalize data into Iceberg tables and DynamoDB.
Define and enforce data quality, lineage, and reconciliation logic with support for both streaming and batch use cases.
Integrate with S3 Bronze/Silver layers and implement efficient schema evolution and partitioning strategies using Iceberg.
Collaborate with architects, analysts, and downstream application teams to design API and file-based egress layers.
Implement monitoring, logging, and event-based alerting using CloudWatch, SNS, and EventBridge.
Mentor junior developers and enforce best practices for modular, secure, and scalable data pipeline development.
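The CDC validation and reconciliation duties above can be sketched in plain Python. This is a hedged illustration only: the event envelope mimics Debezium conventions ("op", "after", "source"), but the row layout and table names are hypothetical examples, not the team's actual schema.

```python
from typing import Optional

def normalize_cdc_event(event: dict) -> Optional[dict]:
    """Flatten a Debezium-style CDC envelope into a target row; skip deletes."""
    if event.get("op") == "d":            # delete events carry no "after" payload
        return None
    row = dict(event["after"])             # state of the row after the change
    row["_source_table"] = event["source"]["table"]
    row["_op"] = event["op"]               # c=create, u=update, r=snapshot read
    return row

def reconcile_counts(source_counts: dict, target_counts: dict) -> dict:
    """Report per-table row-count drift between source and target."""
    return {
        table: source_counts[table] - target_counts.get(table, 0)
        for table in source_counts
        if source_counts[table] != target_counts.get(table, 0)
    }

events = [
    {"op": "c", "after": {"id": 1, "amount": 10}, "source": {"table": "orders"}},
    {"op": "d", "before": {"id": 2}, "source": {"table": "orders"}},
]
rows = [r for e in events if (r := normalize_cdc_event(e)) is not None]
drift = reconcile_counts({"orders": 2}, {"orders": 1})
```

In a real pipeline the same normalize/reconcile split maps onto Glue (PySpark) jobs, with reconciliation running as a batch check against the Iceberg target.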
Requirements
6+ years of hands-on, expert-level data engineering experience in cloud environments (AWS preferred), including event-driven implementations
Strong experience with Apache Kafka / AWS MSK including topic design, partitioning, and Kafka Connect/Debezium
Proficiency in AWS Glue (PySpark) for both batch and streaming ETL
Working knowledge of AWS DMS, S3, Lake Formation, DynamoDB, and Iceberg
Solid grasp of schema evolution, CDC patterns, and data reconciliation frameworks
Experience with infrastructure-as-code (CDK/Terraform) and DevOps practices (CI/CD, Git)
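The "topic design and partitioning" requirement above boils down to key-based partition assignment. A hedged, dependency-free sketch: Kafka's actual default partitioner hashes the key bytes with murmur2, whereas this stand-in uses FNV-1a purely to keep the example self-contained and deterministic.

```python
def fnv1a(data: bytes) -> int:
    """64-bit FNV-1a hash; stands in for Kafka's murmur2 in this sketch."""
    h = 0xCBF29CE484222325
    for b in data:
        h = ((h ^ b) * 0x100000001B3) & 0xFFFFFFFFFFFFFFFF
    return h

def assign_partition(key: str, num_partitions: int) -> int:
    """Same key always maps to the same partition, preserving per-key ordering."""
    return fnv1a(key.encode("utf-8")) % num_partitions

# All CDC events for one business key land on one partition,
# so consumers see that key's changes in order:
p1 = assign_partition("order-42", 12)
p2 = assign_partition("order-42", 12)
```

The design consequence is the point: choosing the message key (e.g. primary key of the source table) fixes both ordering guarantees and partition skew.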
Senior Data Engineer designing and scaling data foundations for AI adoption across Ad Tech. Collaborating with cross-functional teams to deliver robust pipelines for high-profile AI applications.
Specialist in Data Engineering leading pipeline optimization at Inmetrics. Collaborating in innovative data-driven projects within a hybrid work environment.
Data Architect responsible for designing and implementing data architecture at Stefanini. Collaborate with technical teams and stakeholders in a hybrid work environment.
Senior Data Engineer at Reos responsible for scalable ETL pipelines using Microsoft Fabric. Focused on data integration from various sources and data modeling processes.
Junior Data Engineer developing and maintaining data pipelines for AI-powered identity platform at Saviynt. Collaborating with senior engineers, analysts, and BI developers to ensure reliable data for decision-making.
Strategic technical leader architecting data landscape for Sales and Trade at Conagra Brands. Designing scalable solutions and enhancing data integration across enterprise platforms.
Consultant - Data Engineering Specialist supporting public health surveillance data ecosystem. Focused on automated data integrations and ensuring data flows securely across systems.
Data Engineer developing and maintaining data pipelines for SkyShowtime's streaming data ecosystem. Collaborating with teams to facilitate analysis and operationalise data processing systems.
Tech Lead Data Engineering overseeing data engineering, ETL processes, and cloud technologies. Leading project delivery with strong hands-on experience in Informatica, Python, and GCP.
Data Engineer developing and maintaining data infrastructure for healthcare solutions in Portugal. Working with Oracle databases and Pentaho ETL pipelines in a flexible hybrid model.