Implementation Engineer guiding customers in data architecture and integration solutions at MotherDuck. Collaborating with teams and leading technical discussions to ensure customer success.
Responsibilities
Design architectures that make good data work feel easy.
Remove blockers with clarity and help customers reach production with confidence.
Lead technical discussions with customer data and engineering teams.
Apply unorthodox thinking when standard approaches fall short, helping teams rethink assumptions.
Exercise agency by owning the customer journey from proof of concept to production.
Anticipate risks, drive alignment, and allocate internal resources to keep momentum.
Diagnose technical blockers with precision and thoughtfulness.
Navigate ambiguity with flexibility, and escalate nuanced architectural questions to MotherDuck engineering when needed.
Advise customers on ETL and ELT patterns, modeling choices, and integration practices.
Collaborate with Sales, Customer Engineering, and Support to deliver cohesive, high-quality customer experiences.
Create reusable artifacts, best practices, and reference architectures that reflect MotherDuck’s values.
Communicate complex technical concepts with clarity, empathy, and thoughtfulness for both technical and non-technical audiences.
Requirements
5+ years designing, building, or implementing modern data warehouses such as Snowflake, BigQuery, Redshift, or Databricks
Mastery of SQL and data modeling, with the judgment to select the simplest viable approach
Experience translating business requirements into architectural decisions that balance flexibility with long-term scalability
Background in a customer-facing or solutions-oriented role such as solutions engineer or consulting data architect
Evidence of agency in past roles: you proactively drive structure, unblock teams, and make progress without waiting for direction
A thoughtful approach to ambiguity, bringing order where needed and staying flexible where it matters
Willingness to challenge conventional approaches when they don't serve the customer or the mission
Nice to have: experience with DuckDB or MotherDuck; familiarity with Fivetran, dbt, or Airflow; experience in consulting or professional services
Benefits
Feather-ruffling compensation – competitive salary and stock options so you have a stake in our flock’s success.
Top-notch healthcare coverage – 100% paid medical, dental, and vision for employees, plus 80% coverage for dependents (because we care about your whole nest).
Flexible PTO – take the time you need to recharge, explore, or just have a lazy day by the pond.
401k plan – because even ducks need to plan for the future.
Legendary company events – we bring the whole flock together twice a year for unforgettable summits in fun locations, plus 1-2 team gatherings a year to keep our bonds strong.
Flexible work environment – spend most of the week in the office collaborating with the flock, and work from wherever you're most productive the rest of the time. Specifics are role-dependent and agreed upon with your manager.
Data Engineer/Senior Data Engineer developing scalable ETL/ELT pipelines and architecting data systems at Manulife. Collaborating with data professionals to ensure data quality and compliance.
Data Engineer at Pruna AI merging data engineering, analytics engineering, and revenue operations. Working on intelligent automation and analytics for accessible and sustainable AI.
Data Engineer developing modern cloud-based data pipelines at UK Biobank for research support. Collaborating within the Data & Technology team to create clean, scalable, secure code.
Senior Consultant SAP Data Migration leading projects in data migration for various clients. Collaborating with project teams to ensure successful strategy implementation and data quality management.
Data Engineering Team Lead engaging with enterprise-level organizations on data architecture using Google Cloud solutions. Leading a team of data engineers in a hybrid work model.
Senior Lead Data Engineer in Bangalore, Karnataka, India designing and building AWS data engineering solutions. Leading teams on scalable data pipelines using Spark, PySpark, and Python.
Data Engineer creating and implementing Big Data applications at Absa alongside business stakeholders and technology leaders. Involves ETL/ELT pipeline design, data automation, and team mentorship.
Senior Data Architect delivering, enhancing, and adopting enterprise data and analytics products for DoD organizations. Collaborating with teams to translate requirements into scalable solutions for national security outcomes.
Senior Data Engineer supporting the delivery and enhancement of enterprise data and analytics for DoD organizations. Collaborating with engineers and government partners on scalable, production-ready solutions.
Senior Data Engineer at Skillfield designing distributed data processing solutions using Apache Spark. Collaborates on cloud and on-prem solutions across enterprise levels in a hybrid work environment.