Senior Data Engineer building data collection and transformation frameworks on Snowflake and Azure. Leading architecture and design of complex data solutions in a hybrid work environment.
Responsibilities
Create data collection, extraction, and transformation frameworks for structured and unstructured data.
Develop and maintain cloud infrastructure systems (e.g., data warehouses, data lakes), including data access points.
Prepare and manipulate data using Azure SQL, Azure Data Factory, Databricks and other data pipeline tools.
Organize data into formats and structures that improve reuse and enable efficient delivery to business teams, analytics teams, and system applications.
Integrate data across the data lake, data warehouse, and system applications to ensure the delivery of information across the enterprise.
Lead the development and evolution of the data service layer, driving integration of components to deliver an excellent customer offering.
Evaluate data architecture and integrations, with a focus on ongoing optimization and enhancement of solution capabilities.
Engage critically with partners to establish clear needs and link them to solutions, including building prototypes and involving multiple parties in design sessions.
Lead the architecture, design and implementation of complex data solutions and integrations including best practices for the full development life cycle, coding standards, code reviews, source control management, build processes, testing, and operations.
Perform database monitoring and collaborate with database administrators to optimize database performance.
Lead the end-to-end migration of on-premises data solutions to Microsoft Azure, including re-architecting legacy systems, optimizing data pipelines for cloud scalability, and ensuring secure, high-performance integration across platforms.
Collaborate with team members to ensure adherence to standards for code, design, documentation, testing, and deployment.
Collaborate with data governance and strategy teams to ensure data lineage is well understood and constructed in a way that highlights data reuse and simplicity.
Assess new opportunities to simplify data operations with new tools, technologies, file storage, management approaches, and processes.
Resolve data load issues to ensure seamless execution of downstream processes.
Conduct thorough testing and implement proactive measures to prevent recurrence and improve overall data pipeline reliability.
Requirements
5+ years of hands-on experience in relational database development (preferably Microsoft SQL Server or Oracle).
Minimum of 7 years of experience working in data warehousing/management, data architecture and supporting methodologies including 5+ years of hands-on experience in data modelling and data integration.
3+ years of experience designing, developing, and testing Azure data pipelines.
Experience migrating complex data solutions from on-premises infrastructure to cloud environments is a strong asset.
Familiar with software design patterns and best practices.
Bachelor's degree in Software Engineering or Computer Science required (Master's an asset), or equivalent work experience in a technology or business environment.
Proficient in multiple programming languages.
Excellent ability to design and engineer enterprise solutions.
Experience manipulating, processing and extracting value from large, disconnected datasets.
Excellent verbal and written communication skills.
Knowledge of Business Intelligence, Analytics & Reporting.
Familiar with dashboard building and data visualization.
Experience working with DevOps pipelines (Git, Maven, GitLab, Jenkins), continuous integration/delivery, and automated testing (unit, functional, performance).
Benefits
Comprehensive Total Rewards Program, including performance-based bonuses
Flexible benefits starting from day one
Choice of a health spending account (HSA) or personal spending account (PSA)
Retirement planning support, with profit-sharing programs including company match and a defined contribution pension plan
Growth & development opportunities, including unlimited access to Coursera, mentorship programs, and an internal gig marketplace
Holistic wellness support, with an Employee & Family Assistance Program, 24/7 virtual healthcare, and workplace wellness initiatives
Flexibility that works for you, including hybrid work arrangements, a Work from Abroad program, and paid time off programs
Recognition and rewards, with company-wide recognition programs, exclusive banking perks from RBC & BMO, and access to great employee discounts