Data Engineer at Strider Technologies designing and scaling data platforms using cloud technologies. Collaborating with teams to create a robust analytics and BI infrastructure.
Responsibilities
Design, implement, and maintain scalable data pipelines (ETL/ELT) in a modern cloud data warehouse (e.g., Databricks, Snowflake, BigQuery, or Redshift).
Ingest, transform, and integrate internal system data (e.g., CRM, product, finance) and external vendor datasets.
Prepare and process structured and unstructured data to support analytics, business intelligence, and automation use cases.
Model and optimize datasets using Lakehouse or warehouse-based architectures for downstream consumption.
Collaborate closely with analysts and cross-functional partners to build production-grade data assets.
Champion and implement best practices for data quality, observability, and lineage.
Partner with cloud infrastructure teams (AWS, Azure, or GCP) to ensure scalable, secure, and cost-efficient systems.
Explore and deploy AI-enabled automation tools and workflows across the business.
Advocate for productivity-boosting technologies such as ChatGPT, Copilot, Cursor, and dbt-assist.
Help document standards, develop reusable components, and mentor junior data engineers.
Requirements
5+ years of experience in data engineering or analytics engineering roles.
Strong proficiency with SQL and Python (or Scala).
Hands-on experience with modern cloud data warehouses (Databricks, Snowflake, BigQuery, or Redshift).
Familiarity with tools in the modern data stack (e.g., dbt, Airflow, Fivetran, Dagster).
Proven ability to manage and transform large, complex datasets across structured and unstructured formats.
Solid understanding of data warehousing concepts, dimensional modeling, and Git-based version control.
Experience leveraging AI-assisted development tools and enthusiasm for scaling their use internally.
Comfortable working in fast-paced environments and collaborating across technical and non-technical teams.
Benefits
Competitive Compensation
Company Equity Options
Flexible PTO
Wellness Reimbursement
US Holidays (Office Closed)
Paid Parental Leave
Comprehensive Medical, Dental, and Vision Insurance