Hybrid Data Architect, GCP, Snowflake

About the role

  • As a Data Architect at an AI consulting company, you will lead data architecture and engineering projects within GCP and Snowflake environments, collaborating with clients to define data strategies and guiding implementation teams.

Responsibilities

  • Lead data architecture and engineering projects within GCP and Snowflake environments, including migration, transformation, and platform design.
  • Design end-to-end data architectures (ingestion, storage, processing, and serving layers).
  • Define and guide implementation of data integration, ETL/ELT strategies, and pipeline architectures.
  • Develop logical and physical data models, schemas, and data structures across business domains.
  • Guide clients on data strategy, architecture decisions, and best practices across GCP.
  • Design and enforce data governance, security, and access control frameworks.
  • Work closely with security teams to ensure data protection and compliance.
  • Lead client workshops to identify data sources, flows, and requirements.
  • Define future-state architectures, roadmaps, and implementation plans.
  • Collaborate with cross-functional teams (data scientists, engineers, business stakeholders) to deliver data solutions.
  • Evaluate and select tools, frameworks, and architectural patterns.
  • Provide technical leadership and mentorship to engineering teams.
  • Oversee performance optimization, scalability, and cost efficiency of data platforms.

Requirements

  • 7+ years of experience in Data Engineering, Data Architecture, or Data Infrastructure roles.
  • 3+ years of experience working with GCP or similar cloud platforms (AWS, Azure).
  • Proven experience in designing data architectures and large-scale data platforms.
  • Strong experience with GCP managed services (BigQuery, Cloud SQL, Cloud Spanner, Cloud Bigtable).
  • Hands-on experience with Snowflake, including data modeling and performance optimization.
  • Strong expertise in data modeling, database design, and data architecture patterns.
  • Experience with ETL/ELT design, data integration, and migration from legacy systems.
  • Experience working with structured, semi-structured, and unstructured data.
  • Understanding of data governance, security, and compliance frameworks.
  • Experience with Infrastructure as Code (Terraform, Ansible, or similar).
  • Proficiency in SQL and good understanding of Python.
  • Strong analytical, problem-solving, and communication skills.
  • Experience working in a client-facing or consulting environment.

Benefits

  • Work in a supportive team of passionate enthusiasts of AI & Big Data.
  • Engage with top-tier global enterprises and cutting-edge startups on international projects.
  • Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
  • Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading training materials and certifications.
  • Choose your preferred form of cooperation: B2B or a contract of mandate, and make use of 20 fully paid days off.
  • Participate in team-building events and utilize the integration budget.
  • Celebrate work anniversaries, birthdays, and milestones.
  • Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
  • Get full work equipment for optimal productivity, including a laptop and other necessary devices.
  • Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.

Job title

Data Architect, GCP, Snowflake

Job type

Experience level

Senior, Lead

Salary

Not specified

Degree requirement

Bachelor's Degree

Location requirements

Hybrid, Poland
