Design, develop and maintain scalable, reliable data pipelines and APIs in cloud environments.
Collaborate with Data Science and Engineering teams to ensure data quality, governance and cost-effective performance.
Responsibilities
Work on the development, maintenance and evolution of scalable, reliable data solutions
Design, develop and maintain efficient, scalable and reliable data pipelines
Automate extract, transform and load (ETL/ELT) routines in cloud environments
Implement solutions for integrating data from multiple sources, including structured and unstructured data
Develop and maintain data APIs and services for consumption by internal and external applications
Ensure data quality, governance, security and integrity at all stages of the data flow
Work on relational and non-relational data modeling to support analytical and operational use cases
Document architectures, pipelines, processes and data models clearly, consistently and in a reusable manner
Collaborate with Data Science and Software Engineering teams and with business stakeholders
Support continuous improvement of data processes, performance and cloud cost optimization
Requirements
Proficiency in SQL and database management
Experience with Python for data engineering and process automation
Knowledge of REST APIs, JSON formats and system integration
Version control using Git
Knowledge of data architecture and modeling (relational and non-relational)
Experience with Big Data tools (Spark, Hadoop or similar)
Experience with Cloud Computing, preferably AWS (Azure or GCP as alternatives)
Familiarity with software engineering best practices, including testing, versioning and CI/CD
Technical English for reading documentation and participating in technical discussions
Principal Data Engineer designing, building, and maintaining data pipelines for finance analytics at Northrop Grumman. Collaborating with engineers and finance analysts to ensure data accuracy and availability.
Senior Data Engineer responsible for migrating and modernising data platforms in banking. Rebuilding a critical data platform with a focus on risk and core financial data flows.
Data Engineering Lead managing enterprise-scale data platforms using AWS, Snowflake, and Databricks in financial services. Leading data engineering teams and ensuring data governance.
AWS Data Engineer working in Gurugram to support data architecture and integration solutions. Collaborating with stakeholders to translate business needs into data models.
Senior Data Engineer handling data engineering responsibilities in a hybrid setting for the banking industry. Collaborating with cross-functional teams and maintaining data quality in Azure environments.
Data Management professional at Kyndryl involved in creating innovative data solutions and ensuring the seamless operation of complex data systems. Collaborating with teams to transform requirements into scalable database solutions.
Software Engineer designing and developing scalable data processing applications on cloud infrastructure for Thomson Reuters. Collaborating with Data Analysts on AI-enabled solutions for data management and insight generation.
Manager of Data Platform overseeing AWS cloud infrastructure and Snowflake data warehouses for Thomson Reuters. Leading the design and implementation of data processing applications in a hybrid role located in Bengaluru.
Senior Data Engineer designing scalable data pipelines and solutions for Enterprise Data Lake at Thomson Reuters. Collaborating across teams to ensure efficient data ingestion and accessibility.
Senior Data Engineer at Technis developing scalable data pipelines and solutions for innovative connected spaces products. Collaborating within a cross-functional team to deliver high-quality data-driven outcomes.