Senior Data Engineer designing and building data warehouse solutions with Snowflake for a fintech company. Collaborating with cross-functional teams to facilitate data insights and analytics.
Responsibilities
Design and build a data warehouse on a cloud-based data platform (Snowflake) for machine learning, data analysis, self-serve analytics, and reporting needs
Develop Extract, Transform, Load (ETL) pipelines to orchestrate script execution, automate data transformation, and load data into the data warehouse
Perform data cleansing, validation, testing and schema design to ensure accuracy and reliability of data insights
Build and manage automated workflows, and monitor these pipelines and their DAGs (Directed Acyclic Graphs)
Identify and implement improvements by automating manual processes to optimize data delivery and redesigning infrastructure for greater scalability using Python and SQL
Collaborate with cross-functional teams, including data engineers, analysts, and data scientists, to understand business logic and needs and to create effective, efficient pipelines and data models
Create ad-hoc reports and data visualizations based on user requirements
Participate in ETL flow design reviews and recommend solutions to improve processes
Perform data quality checks and set up alerting to identify issues and resolve bugs in a timely manner
Participate in source code and design reviews in a software development lifecycle (SDLC) driven environment using technical, functional, and domain knowledge
Identify issues or gaps and suggest improvements to the team processes
Provide support for customer requests across all products handled by the team
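The orchestration and data-quality duties above can be sketched in plain Python. This is a minimal illustration, not tied to any specific orchestrator or to this employer's actual stack; every function and task name here is hypothetical.

```python
# Minimal sketch (assumed, hypothetical names): run a small ETL DAG in
# dependency order with a simple data quality check between transform and load.
from graphlib import TopologicalSorter

def extract():
    # Hypothetical source rows; a real pipeline would pull from a warehouse or API.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.2"}]

def transform(rows):
    # Cast string amounts to floats.
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

def quality_check(rows):
    # Fail fast if any amount is non-positive (a stand-in for richer validation).
    assert all(r["amount"] > 0 for r in rows), "amount must be positive"
    return rows

def load(rows):
    # Stand-in for loading into the warehouse; returns the row count.
    return len(rows)

# DAG definition: each task maps to the set of tasks it depends on.
dag = {"transform": {"extract"}, "check": {"transform"}, "load": {"check"}}
tasks = {"extract": extract, "transform": transform,
         "check": quality_check, "load": load}

# Execute tasks in topological order, passing each task its upstream result.
results = {}
for name in TopologicalSorter(dag).static_order():
    upstream = dag.get(name, set())
    if upstream:
        results[name] = tasks[name](results[next(iter(upstream))])
    else:
        results[name] = tasks[name]()

print(results["load"])  # → 2 (rows loaded)
```

In production this sequencing and alerting would be delegated to a scheduler such as Airflow; the sketch only shows the dependency-ordered execution and fail-fast validation pattern.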
Requirements
Master’s degree (or its foreign degree equivalent) in Analytics, Data Science, Engineering (any field), or a related quantitative discipline
Six (6) months of experience in the job offered or in any related occupation
Data warehouse design and dimensional modeling
Extract, Transform, Load (ETL)
SQL
Python
Data Visualization (Tableau or Power BI)
Gitlab
Azure
PySpark
OLAP and OLTP system experience (MySQL, PostgreSQL, or Snowflake)
Machine Learning
Data pipeline performance and scalability techniques
Data Engineer role focused on creating and monitoring data pipelines in an innovative energy company. Collaborate with IT and departments to ensure quality data availability in a hybrid work environment.
SQL Migration Data Engineer at Auxo Solutions focusing on Azure SQL/Fabric Lakehouse migrations and building data pipelines. Collaborating on technical designs and data governance for modernization initiatives.
Data Engineer developing cloud solutions and software tools on Microsoft Azure big data platform. Collaborating with various teams for data analysis and visualization in healthcare.
Boomi Integration Architect designing and leading integration solutions for data warehouses. Collaborating with cross-functional teams to implement scalable integration patterns using Boomi technologies.
Seeking a Boomi Integration Architect specializing in Data Warehouse and Master Data Hub implementations. Responsible for designing high-performance integration solutions across enterprise platforms.
Principal Data Engineer at Serko enhancing global travel tech through data-driven solutions. Collaborating across teams in Bengaluru to drive innovative engineering and best practices.
Data Engineer at Keyrus responsible for building and optimizing data pipelines for major projects. Contributing to data solutions and ensuring data quality in a growing team.
Data Architect designing and implementing scalable data architectures for Keyrus in Bordeaux. Leading client transitions and contributing to the tech ecosystem with innovative data solutions.