Data Engineer developing and maintaining scalable data platforms at Vintage Cash Cow. Collaborating with teams to ensure accurate and reliable data for decision-making.
Responsibilities
Build and maintain a modern, scalable data platform that supports growth and decision-making.
Ensure data is accurate, consistent, and trusted across the business.
Improve speed, reliability, and automation of data pipelines and reporting workflows.
Enable high-quality self-serve analytics by delivering well-modelled, well-documented data sets.
Support digital performance and CRM insight through strong marketing data foundations.
Design, implement, and maintain robust data pipelines across multiple systems.
Ensure smooth, well-governed flow of data from source → warehouse → BI layers.
Support end-to-end warehouse design and modelling as our stack grows.
Integrate and manage a wide range of data sources within Snowflake.
Build automated checks to monitor accuracy, completeness, and freshness.
Run regular audits and troubleshoot issues quickly and calmly.
Identify opportunities to streamline pipelines, improve performance, and reduce cost.
Work closely with teams across Growth, Finance, Ops, and Product to understand KPIs and reporting needs.
Document architecture, pipelines, models, and workflows so everything is clear and easy to pick up.
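The automated accuracy, completeness, and freshness checks described above might look something like the following minimal Python sketch. The table shape, field names, and thresholds are hypothetical illustrations, not details from the posting:

```python
from datetime import datetime, timedelta, timezone

def check_completeness(rows, required_fields, max_null_rate=0.01):
    """Return the fields whose null rate exceeds the allowed threshold."""
    failures = {}
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        rate = nulls / len(rows) if rows else 1.0
        if rate > max_null_rate:
            failures[field] = rate
    return failures

def check_freshness(rows, ts_field, max_age=timedelta(hours=24)):
    """Return True if the newest record arrived within the allowed window."""
    if not rows:
        return False
    newest = max(r[ts_field] for r in rows)
    return datetime.now(timezone.utc) - newest <= max_age
```

In practice checks like these would run on a schedule against the warehouse (e.g. via dbt tests or Airflow sensors) rather than over in-memory rows; the sketch only shows the shape of the logic.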
Requirements
Strong Snowflake experience: loading, querying, optimising, and building views/stored procedures.
Solid SQL skills: confident writing complex queries over large datasets.
Hands-on pipeline experience using tools like dbt, Fivetran, Airflow, Coalesce, Hightouch, RudderStack, Snowplow, or similar.
Data warehousing know-how and a clear view of what “good” looks like for scalable architecture.
Analytical, detail-focused mindset: you care about quality, reliability, and root-cause fixes.
Great communication: able to explain technical concepts in a simple, useful way.
Comfortable working in a small, high-impact team where you’ll shape the roadmap.
Nice to have
Experience working with HubSpot data (ETL into a warehouse, understanding the schema, reporting context).
Digital marketing analytics background: ads platforms, attribution, funnel performance, campaign measurement.
Familiarity with CRMs/marketing automation tools (HubSpot, Marketo, Salesforce, etc.).
Python or R for automation, data wrangling, or pipeline support.
Understanding of A/B testing or experimentation frameworks.
Exposure to modern data governance/catalogue tooling.
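The "complex queries over large datasets" the requirements describe often include warehouse modelling steps such as deduplicating an append-only raw table to the latest record per key. A hypothetical illustration using Python's built-in sqlite3 (all table and column names are invented for the example):

```python
import sqlite3

# Build a tiny in-memory raw table with duplicate customer records.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_customers (customer_id INTEGER, email TEXT, loaded_at TEXT);
    INSERT INTO raw_customers VALUES
        (1, 'old@example.com', '2024-01-01'),
        (1, 'new@example.com', '2024-03-01'),
        (2, 'solo@example.com', '2024-02-01');
""")

# Keep only the most recently loaded record per customer.
latest = conn.execute("""
    SELECT r.customer_id, r.email
    FROM raw_customers r
    JOIN (SELECT customer_id, MAX(loaded_at) AS max_loaded
          FROM raw_customers
          GROUP BY customer_id) m
      ON r.customer_id = m.customer_id AND r.loaded_at = m.max_loaded
    ORDER BY r.customer_id
""").fetchall()
```

In Snowflake the same dedup is usually written with `ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...)` and `QUALIFY`; the self-join form above is just a portable way to show the idea.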
Data Architect leading design and implementation of cloud data platforms for digital transformation. Collaborating with stakeholders to define data strategies and governance models.
Data Engineer Consultant designing and optimizing data infrastructure for clients' business needs. Working with SQL and data visualization tools in a mainly remote role with some onsite responsibilities in Denver.
Data Engineer creating real-time data processing applications for a leading iGaming operator. Work involves stream data manipulation and collaboration in an Agile environment.
Data Engineer at Voodoo optimizing real-time data pipelines for gaming and consumer apps to support growth. Joining a top-tier data team dedicated to monetizing via advertising partners in a competitive landscape.
Cloud Data Engineer designing data architectures for cloud platforms at fifty-five. Collaborating with local and global teams to optimize marketing ROI and customer experience.
SAP Specialist responsible for designing, developing, and executing data migration objects in Hydro’s SAPEX program. Ensuring successful ETL processes and maintaining data quality.
Senior Data Engineer building scalable data pipelines and data models within retail at Avaron. Collaborating closely with business and technical teams to ensure reliable data solutions.
Senior Data Engineer building and operating the data platform at bsport. Collaborating with the Data team to optimize data intake and accessibility for analytics and AI.
Data Engineer building and maintaining Azure data platforms for Hultafors Group's analytics and reporting needs. Collaborating across various business functions in a cloud environment.
Lead Data Pipeline Manager at Valpak, overseeing data pipelines for environmental compliance initiatives. Collaborating with teams to ensure data quality and operational performance.