Data Engineer at Borrowell, building and maintaining data infrastructure. Collaborating on solutions using Snowflake, Apache Airflow, and other tools.
Responsibilities
Implement end-to-end data pipelines to extract and transform data from a variety of sources, ranging from unstructured to fully structured, for multiple business initiatives
Design and implement reusable base data models in Snowflake to support functions such as analytics, data science, and marketing
Optimize data pipelines with a focus on data quality to improve data observability
Work cross-functionally with other data and business teams to champion data engineering initiatives and ensure projects are delivered on time and aligned with stakeholder expectations
Contribute meaningfully to data infrastructure decisions and implementations
Requirements
2+ years of experience building end-to-end ETL/ELT data pipelines
Experience working with:
Object-oriented and functional scripting languages (Python)
Query authoring (SQL) as well as practical familiarity with relational databases
Data modelling techniques for data warehousing
Data platforms (Snowflake/Redshift/BigQuery)
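The core requirements above (Python, SQL authoring, data modelling, and a warehouse platform) can be illustrated with a minimal sketch. This is a toy example, not Borrowell's actual stack: the table, column names, and sample records are hypothetical, and the standard-library `sqlite3` stands in for a warehouse such as Snowflake:

```python
import sqlite3

# Hypothetical raw records, standing in for an extracted source feed.
raw_loans = [
    {"id": 1, "amount": "2500.00", "province": "on"},
    {"id": 2, "amount": "1800.50", "province": "QC"},
    {"id": 3, "amount": None, "province": "on"},  # incomplete record
]

def transform(records):
    """Normalize types and drop records that fail a basic quality check."""
    clean = []
    for r in records:
        if r["amount"] is None:
            continue  # a real pipeline might route this to a quarantine table
        clean.append((r["id"], float(r["amount"]), r["province"].upper()))
    return clean

# sqlite3 stands in here for a warehouse platform (Snowflake/Redshift/BigQuery).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (id INTEGER PRIMARY KEY, amount REAL, province TEXT)")
conn.executemany("INSERT INTO loans VALUES (?, ?, ?)", transform(raw_loans))

# A reusable base query that downstream functions could build on.
totals = conn.execute(
    "SELECT province, SUM(amount) FROM loans GROUP BY province ORDER BY province"
).fetchall()
print(totals)  # [('ON', 2500.0), ('QC', 1800.5)]
conn.close()
```

The extract/transform/load split mirrors the pipeline structure the role describes; in practice each stage would be a separate, tested module rather than one script.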
**Nice to Haves:**
Experience working with:
Data transformation tools (dbt)
Workflow orchestration tools (Airflow/Dagster)
Containerized applications (Docker)
Familiarity with cloud platforms (Azure/AWS/GCP)
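At their core, orchestration tools like Airflow and Dagster schedule tasks in dependency order. A toy pure-Python sketch of that idea (task names are hypothetical; no Airflow API is implied):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline DAG; each value is the set of upstream dependencies.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "notify": {"load"},
}

run_log = []

def run(task):
    run_log.append(task)  # a real orchestrator would execute an operator here

# static_order() yields tasks only after all their dependencies.
for task in TopologicalSorter(dag).static_order():
    run(task)

print(run_log)  # ['extract', 'transform', 'load', 'notify']
```

Real orchestrators add what this sketch omits: scheduling, retries, backfills, and observability, which is why they appear as a distinct skill from plain scripting.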
**Important Qualities:**
Desire to continuously learn how to implement the latest technologies and analytical tools into our tech stack
Willingness to embrace and act upon feedback
Open and transparent communication, ability to adjust communication to suit both technical and non-technical audiences
Benefits
**The Opportunity** - join and have a major impact at a growing company that is helping Canadians feel confident about money.
**Comprehensive Health Benefits** - medical, dental, vision, and paramedical health benefits for you and your family, with extra yearly coverage for psychotherapists and massage therapists
**Additional Health Benefits** - virtual benefit offering that allows you to connect 24/7 with nurses, doctors and mental health professionals
**Maternity & Parental Leave Top-up** - available to new parents
**WFH Reimbursement** - we ship you gear such as a laptop, mouse, and keyboard, and you can expense additional items that make your workspace better for you
**Employee Development Benefit** - an annual reimbursement to support your learning
**Givewell Benefit** - 1 paid volunteer day a year to give back to the community
**Flexibility** - flexible working hours and a flexible vacation policy
Senior Data Engineer supporting an AI-enabled financial compliance initiative with data pipelines and ingestion processes. Collaborating with diverse teams in a mission-critical, regulated environment.
Data Architect leading the definition and construction of cloud data architecture for Kyndryl. Participating in significant technological modernization initiatives, focusing on Google Cloud Platform.
Senior Data Engineer driving data intelligence requirements and scalable data solutions for a global consulting firm. Collaborating across functions to enhance Microsoft architecture and analytics capabilities.
Experienced AI Engineer designing and building production-grade agentic AI systems using generative AI and large language models. Collaborating with data engineers and data scientists in a tech-driven company.
Intermediate Data Engineer designing and building data pipelines for travel industry data management. Collaborating across teams to ensure reliable data for analytics and reporting.
Data Engineer managing and organizing datasets for AI models at Walaris, developing AI-driven autonomous systems for defense and security applications.
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.