Data Engineer Tech Lead developing data solutions at Carelon. Leading a cross-functional team to optimize data workflows and maintain data integrity.
Responsibilities
Enforce coding standards, review code, and ensure maintainability.
Coach developers on technical skills and problem-solving.
Help team members grow in their technical careers.
Develop ETL/ELT processes to ingest and transform data from various sources, including MongoDB, APIs, Snowflake, and flat files (JSON, ORC, Avro, Parquet, CSV).
Implement data loading strategies into Snowflake and other data warehouses.
Write and optimize complex SQL queries for data extraction, transformation, and analysis across multiple database platforms.
Work with Product Managers, Architects, and other teams to align technical goals with business needs.
Leverage AWS services to manage and orchestrate data workflows, ensuring high availability and scalability.
Implement jobs in AWS EMR, AWS Glue, and AWS Lambda using PySpark.
Perform performance tuning on Spark jobs and SQL queries to ensure efficient data processing.
Participate in Agile ceremonies and contribute to sprint planning, story grooming, and retrospectives.
Collaborate with cross-functional teams including data scientists, analysts, and DevOps engineers to deliver data solutions.
Ensure data accuracy, consistency, and integrity through validation and quality checks.
Maintain documentation for data pipelines, schemas, and data flow processes.
Work with Kafka or similar technologies to build and maintain real-time data streaming solutions.
Develop and maintain data ingestion processes from external/internal APIs, ensuring secure and reliable data flow.
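To illustrate the kind of validation and quality checks described above, here is a minimal sketch in plain Python (the feed name, column names, and rules are hypothetical, not part of this role's actual pipelines):

```python
import csv
import io

# Hypothetical per-column quality rules for a "members" CSV feed.
RULES = {
    "member_id": lambda v: bool(v),                             # required, non-empty
    "state": lambda v: len(v) == 2 and v.isalpha(),             # 2-letter state code
    "claim_amount": lambda v: v.replace(".", "", 1).isdigit(),  # non-negative number
}

def validate(row):
    """Return the names of the columns whose rules the row violates (empty = clean)."""
    return [col for col, ok in RULES.items() if not ok(row.get(col, ""))]

# Sample feed: one clean row, one missing ID, one with a bad state and amount.
feed = io.StringIO(
    "member_id,state,claim_amount\n"
    "M001,OH,120.50\n"
    ",TX,15.00\n"
    "M002,Ohio,-3\n"
)
rows = list(csv.DictReader(feed))
clean = [r for r in rows if not validate(r)]
bad = {r["member_id"] or "<blank>": validate(r) for r in rows if validate(r)}
# clean -> only the M001 row; bad -> {"<blank>": ["member_id"], "M002": ["state", "claim_amount"]}
```

In a real pipeline the same rule table would typically be applied inside a Spark job (e.g., as column expressions) before loading into Snowflake, with rejected rows routed to a quarantine table for review.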
Requirements
5+ years’ experience in the Spark ecosystem, Python/Scala programming, MongoDB data loads, Snowflake, and the AWS platform (EMR, Glue, S3)
6+ years’ IT experience and strong expertise in SDLC/Agile
6+ years’ experience in SQL, complex queries, and query optimization
Hands-on experience writing advanced SQL queries and familiarity with a variety of databases.
Experience coding solutions in Python/Spark and performing performance tuning and optimization.
Experience building and optimizing big-data pipelines in the cloud.
Experience handling different file formats such as JSON, ORC, Avro, Parquet, and CSV.
Hands-on experience in data processing with NoSQL databases such as MongoDB.
Familiarity with job scheduling.
Hands-on experience working with APIs to process data.
Understanding of data streaming technologies such as Kafka.
Certifications in Snowflake (SnowPro) and AWS (Cloud Practitioner/Solutions Architect).
Hands-on experience implementing Kafka streaming pipelines.
Benefits
Extensive focus on learning and development.
An inspiring culture built on innovation, creativity, and freedom.