Data Engineer creating advanced data solutions for Mastercard, a global payments technology company. Collaborate cross-functionally to implement robust data pipelines and architectures.
Responsibilities
Gather and understand data engineering requirements based on product and engineering specifications
Conduct data discovery to build data models, schemas, tables, views, and related deployment artifacts
Build appropriate data pipelines in the application to support ETL and automate pipeline execution
Build or use existing tools to execute one-time, ad-hoc data engineering requests
Implement and manage CI/CD pipelines using GitLab and Jenkins, enabling automated testing, deployment, and monitoring of data workflows
Build fact tables and conduct analysis and reporting as needed
Conduct data analysis and perform data operations to support business decisions
Provide architecture guidelines and data engineering support during product development
Apply exceptional problem-solving and analytical skills to troubleshoot complex data and system issues
Support the Data Science team with model deployment, data pipelines, model serving, and related needs
Document pipeline architecture, processes, and standards, promoting consistency and transparency across the team
Stay current with the latest trends in data engineering concepts and tools, and share them with the wider organization through brown-bag sessions and similar forums
Provide thought leadership within this area for the entire department
Work with TPMs, product managers, and relevant stakeholders to plan and lead the execution of data engineering assignments
Requirements
Expert in SQL development with hands-on experience in Databricks, Snowflake, Python, and PySpark for designing and implementing advanced data engineering solutions
Proven experience collaborating with stakeholders and cross-functional teams to understand business requirements and deliver reliable, high-impact cloud data solutions across Azure, AWS, and Cloud Data Warehouse platforms
Strong leadership and communication skills, with the ability to guide and mentor data engineering teams, ensuring effective collaboration between technical and non-technical stakeholders
Skilled in architecting and developing scalable, reusable data models, pipelines, and frameworks leveraging Hadoop, NiFi, and modern cloud Data Lake architectures
Experienced in planning and executing end-to-end deployments, upgrades, and migrations with minimal disruption to operations, ensuring adherence to best practices in cloud-native and distributed systems
Bachelor’s degree in Computer Science
Good understanding of payment networks
Skilled in identifying new product ideas from data
Experience publishing scientific work and protecting intellectual property
Benefits
insurance (including medical, prescription drug, dental, vision, disability, life insurance)
flexible spending account and health savings account
paid leaves (including 16 weeks of new parent leave and up to 20 days of bereavement leave)
80 hours of Paid Sick and Safe Time
25 days of vacation time and 5 personal days, pro-rated based on date of hire
10 annual paid U.S. observed holidays
401k with a best-in-class company match
deferred compensation for eligible roles
fitness reimbursement or on-site fitness facilities