Data Engineer designing and maintaining scalable data solutions in GCP and Snowflake environments. Collaborating with clients and stakeholders to ensure data quality and functionality.
Responsibilities
Design, develop, test, and maintain data pipelines and ETL/ELT processes using GCP and Snowflake.
Implement data ingestion, transformation, and storage solutions for structured, semi-structured, and unstructured data.
Build and optimize batch, micro-batch, and real-time data pipelines.
Support data migration from legacy systems to cloud platforms (GCP, Snowflake).
Collaborate with business and technical stakeholders to translate requirements into scalable data solutions.
Work with GCP services such as BigQuery, Cloud SQL, Cloud Spanner, and Cloud Bigtable.
Integrate data from various sources and support data platform development.
Ensure data quality by implementing validation rules, testing frameworks, and monitoring solutions.
Work closely with security teams to ensure data protection, access control, and compliance.
Support development of data models and schemas for analytics and reporting.
Contribute to CI/CD processes, version control, and infrastructure automation (e.g., Git, Terraform).
Collaborate with data scientists, analysts, and engineers to support data-driven use cases.
Requirements
5+ years of experience in a Data Engineering or similar role.
2+ years of experience working with GCP or similar cloud platforms (AWS, Azure).
Hands-on experience with GCP managed data services (e.g., BigQuery, Cloud SQL, Cloud Spanner, Cloud Bigtable).
Experience working with Snowflake.
Strong knowledge of SQL and experience with data transformation tools such as dbt.
Proficiency in Python for data processing and scripting.
Experience with ETL/ELT processes and data pipeline development.
Experience working with structured, semi-structured, and unstructured data.
Familiarity with data orchestration tools (e.g., Airflow, Dagster, or similar).
Experience with version control (Git) and CI/CD practices.
Experience with Infrastructure as Code (e.g., Terraform, Ansible, or similar).
Strong analytical, problem-solving, and troubleshooting skills.
Excellent written and verbal communication skills in English.
Experience in a client-facing or consulting environment is a plus.
Benefits
Work in a supportive team of passionate enthusiasts of AI & Big Data.
Engage with top-tier global enterprises and cutting-edge startups on international projects.
Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading training materials and certifications.
Choose your preferred form of cooperation: B2B or a contract of mandate, and make use of 20 fully paid days off.
Participate in team-building events and utilize the integration budget.
Celebrate work anniversaries, birthdays, and milestones.
Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
Get full work equipment for optimal productivity, including a laptop and other necessary devices.
Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.