Data Engineer specializing in Snowflake at Cayuse Commercial Services, focusing on data pipelines and analytics capabilities. Collaborating with business intelligence teams to enhance decision-making.
Responsibilities
Utilize SQL window functions to perform advanced data calculations, such as running totals, moving averages, and rankings, enabling deeper insights into sales trends and performance metrics.
Prepare and automate detailed real-time reports using Python and SQL, providing actionable insights to stakeholders and enhancing the company’s decision-making processes.
Develop and optimize data pipelines using Snowpipe and SnowSQL for real-time and batch data ingestion, improving data processing efficiency by 20%.
Implement Snowpark to build scalable and high-performance data applications using Python, enhancing data manipulation, analysis, and processing capabilities.
Leverage Snowflake’s Time Travel features to enable auditing, data recovery, and version control, ensuring data integrity and regulatory compliance.
Manage Snowflake's Zero Copy Cloning to create cost-effective development, testing, and staging environments without incurring additional storage costs.
Collaborate with product managers to develop dynamic dashboards in Tableau that track key supply chain metrics and identify Key Risk Indicators (KRIs) and Key Performance Indicators (KPIs).
Requirements
Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field.
Proven experience working with Snowflake Data Cloud and its key functionalities, including Snowpark, Snowpipe, and Time Travel.
Proficiency in programming with Python for data processing and analytics.
Advanced SQL skills, including working with window functions for complex calculations and analysis.
Experience developing interactive dashboards with Tableau or similar visualization tools.
Strong understanding of data engineering best practices, including data integrity, compliance, and real-time data ingestion techniques.
Familiarity with cloud data architecture and concepts such as Zero Copy Cloning and scalable application development.
Certification in Snowflake or related cloud data platforms (e.g., AWS, Azure, GCP) - preferred.
Experience in supply chain analytics or supporting business systems in a large-scale environment - preferred.
Strong collaboration skills and ability to work cross-functionally with product managers, analysts, and other engineering teams - preferred.
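The ranking side of the "window functions for complex calculations" requirement often takes the form of a top-N-per-group query. A hedged sketch, again using SQLite as a stand-in for Snowflake; the `region_sales` table, its columns, and the sample rows are illustrative assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE region_sales (region TEXT, rep TEXT, amount REAL)")
conn.executemany("INSERT INTO region_sales VALUES (?, ?, ?)", [
    ("East", "Ana", 500.0), ("East", "Bo", 300.0),
    ("West", "Cy", 450.0), ("West", "Di", 700.0),
])

# RANK() restarts per region (PARTITION BY); the outer query keeps
# only the top-ranked rep in each region.
top = conn.execute("""
    SELECT region, rep, amount
    FROM (
        SELECT region, rep, amount,
               RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
        FROM region_sales
    )
    WHERE rnk = 1
    ORDER BY region
""").fetchall()

print(top)  # [('East', 'Ana', 500.0), ('West', 'Di', 700.0)]
```

`RANK()` leaves gaps after ties (two reps tied at rank 1 would both be returned and the next rank would be 3); `DENSE_RANK()` or `ROW_NUMBER()` can be swapped in when different tie behavior is wanted.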