Responsibilities
Work with the Data Engineering team and report to Fardad Golshany, partnering closely with Engineering, Product, GTM, and Finance teams.
Design and build scalable analytics data marts in Snowflake using dbt that enable data science and insights across multiple business domains.
Maintain existing data pipelines and build new ETL and reverse ETL workflows to ensure reliable data movement across systems.
Stand up and scale orchestration infrastructure using Dagster or Airflow, implementing monitoring and alerting via Metaplane.
Build dynamic executive-facing dashboards in Sigma to reduce time to insight and enable AI-powered self-serve analytics.
Establish data governance standards with automated testing, comprehensive documentation, and semantic definitions that reduce ad-hoc queries.
Optimize dbt pipeline performance by implementing incremental models, improving materializations, and applying engineering best practices.
Requirements
You have 4+ years building production data pipelines and analytics infrastructure in cloud-native environments, with deep expertise in SQL and data transformation at scale.
You bring advanced production experience with dbt (2+ years), including writing macros and packages and implementing best practices, plus hands-on experience with Snowflake performance tuning and cost optimization.
You have production experience with orchestration tools (Dagster or Airflow) and understand DAG design, dependency management, and monitoring.
You've built executive-level dashboards and BI solutions in Sigma (or similar tools) and understand how to structure semantic layers and optimize data models for AI-powered analytics.
You understand dimensional modeling and analytics architecture — designing data marts, fact/dimension tables, and semantic layers comes naturally to you.
You work autonomously and drive complex projects end-to-end, thriving in ambiguity while communicating clearly with business stakeholders across all levels.
Benefits
Incredible teammates: Work alongside some of the nicest and smartest people you'll ever meet.
Ownership mindset: We're all owners here, literally. Employees receive equity in Scribe, sharing in the company's long-term success.
Comprehensive coverage: We offer health, dental, and vision insurance for you and your dependents.
Time to recharge: Flexible paid time off, plus company holidays to rest and reset.
Retirement planning: Employees can contribute to a 401(k) plan to help plan for their future.
Support for growing families: Paid parental leave to help you care for and bond with your growing family.
Lunch, on us: SF-based employees receive daily catered lunches at our office.
Easy commutes: Commuter benefits for our office-based team make getting to and from HQ simpler.
Level up your home office: Remote? Hybrid? Wherever you work, we'll support your setup with a home office stipend.