Senior Data Engineer designing and building data warehouse solutions with Snowflake for a fintech company. Collaborating with cross-functional teams to facilitate data insights and analytics.
Responsibilities
Design and build a data warehouse on a cloud-based data platform (Snowflake) for machine learning, data analysis, self-serve analytics, and reporting needs
Develop Extract, Transform, Load (ETL) pipelines to orchestrate script execution, automate data transformation, and load data into data warehouses
Perform data cleansing, validation, testing, and schema design to ensure the accuracy and reliability of data insights
Build and manage automated workflows, and monitor pipelines and DAGs (Directed Acyclic Graphs)
Identify and implement improvements by automating manual processes to optimize data delivery and redesigning infrastructure for greater scalability using Python and SQL
Collaborate with cross-functional teams, including data engineers, analysts, and data scientists, to understand business logic and needs and to create effective and efficient pipelines and data models
Create ad hoc reports and data visualizations based on user requirements
Participate in ETL flow design reviews and recommend solutions to improve processes
Perform data quality checks and set up alerting to identify issues and resolve bugs in a timely manner
Participate in source code and design reviews in a software development lifecycle (SDLC) driven environment using technical, functional, and domain knowledge
Identify issues or gaps and suggest improvements to team processes
Provide support for customer requests for all products handled by the team
Requirements
Master’s degree (or its foreign degree equivalent) in Analytics, Data Science, Engineering (any field), or a related quantitative discipline
Six (6) months of experience in the job offered or in any occupation in a related field
Data warehouse design and dimensional modeling
Extract, Transform, Load (ETL)
SQL
Python
Data Visualization (Tableau or Power BI)
GitLab
Azure
PySpark
OLAP and OLTP system experience (MySQL, PostgreSQL, or Snowflake)
Machine Learning
Data pipeline performance and scalability techniques
Junior Data Engineer role at Allegro focusing on geospatial data transformation and implementing automation workflows using SQL/Python. Opportunity to work with leading data teams and technologies.
Data Engineer II at Nium focused on building scalable data solutions for global payments. Collaborating with teams on cloud migration and high-impact data projects in a hybrid environment.
Senior GCP Data Engineer leading the design and optimization of scalable data platforms on Google Cloud. Collaborating with cross-functional teams for analytics and business-critical applications.
Data Engineer Lead developing and maintaining data applications and pipelines in Azure for State Street. Collaborating with teams to ensure data integrity and implementing data processing principles.
Data Architect designing and implementing data architectures for pharmaceutical clients. Collaborating with teams to create efficient data ecosystems that drive business value.
Working student in Automation & Data Engineering at Kelvion supporting internal tools and automations using Microsoft Power Platform. Involvement in systems configuration, reports, dashboards, and data structure maintenance.
Senior AWS Data Engineer supporting HHS grant award management, providing technical leadership and maintaining cloud data systems. Ideal for candidates excelling in modernization and operational environments.
Data Engineer at Booz Allen developing technology solutions for clients analyzing large datasets. Collaborating with teams on mission-driven projects using data engineering best practices.
Data Engineer supporting data pipelines for business insights and GenAI applications at Assurity Trusted Solutions. Collaborating with teams on data workflows in a hybrid environment.
Senior Data Engineer leveraging big data and cloud expertise to build data pipelines at Alight. Ensuring reliability, governance, and operational excellence across data platforms.