Data Architect leading data architecture and engineering projects within GCP and Snowflake environments for an AI consulting company. Collaborating with clients to define data strategies and guide implementation teams.
Responsibilities
Lead data architecture and engineering projects within GCP and Snowflake environments, including migration, transformation, and platform design.
Design end-to-end data architectures (ingestion, storage, processing, and serving layers).
Define and guide implementation of data integration, ETL/ELT strategies, and pipeline architectures.
Develop logical and physical data models, schemas, and data structures across business domains.
Guide clients on data strategy, architecture decisions, and best practices across GCP.
Design and enforce data governance, security, and access control frameworks.
Work closely with security teams to ensure data protection and compliance.
Lead client workshops to identify data sources, flows, and requirements.
Define future-state architectures, roadmaps, and implementation plans.
Collaborate with cross-functional teams (data scientists, engineers, business stakeholders) to deliver data solutions.
Evaluate and select tools, frameworks, and architectural patterns.
Provide technical leadership and mentorship to engineering teams.
Oversee performance optimization, scalability, and cost efficiency of data platforms.
Requirements
7+ years of experience in Data Engineering, Data Architecture, or Data Infrastructure roles.
3+ years of experience working with GCP or similar cloud platforms (AWS, Azure).
Proven experience in designing data architectures and large-scale data platforms.
Hands-on experience with Snowflake, including data modeling and performance optimization.
Strong expertise in data modeling, database design, and data architecture patterns.
Experience with ETL/ELT design, data integration, and migration from legacy systems.
Experience working with structured, semi-structured, and unstructured data.
Understanding of data governance, security, and compliance frameworks.
Experience with Infrastructure as Code (Terraform, Ansible, or similar).
Proficiency in SQL and good understanding of Python.
Strong analytical, problem-solving, and communication skills.
Experience working in a client-facing or consulting environment.
Benefits
Work in a supportive team of people passionate about AI & Big Data.
Engage with top-tier global enterprises and cutting-edge startups on international projects.
Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading training materials and certifications.
Choose your preferred form of cooperation: B2B or a contract of mandate, and make use of 20 fully paid days off.
Participate in team-building events and utilize the integration budget.
Celebrate work anniversaries, birthdays, and milestones.
Access medical and sports packages, eye care, and well-being support services, including psychotherapy and coaching.
Get full work equipment for optimal productivity, including a laptop and other necessary devices.
Experience a smooth onboarding with a dedicated buddy, and start your journey in our friendly, supportive, and autonomous culture.
Senior Data Engineer designing scalable data pipelines and supporting ML workloads at Trainline. Collaborating with cross-functional teams in a hybrid environment based in London.
Performance Data Engineer providing data modeling expertise and engineering a cloud-based Data Lakehouse platform. Supporting Federal agency ETL applications with ongoing development and maintenance responsibilities.
Manager for Data & AI team at Valorem Reply focused on modern data platforms with Microsoft technologies. Leading technical direction and collaborating with clients on data governance frameworks.
Senior Staff Data Engineer at DeepL working on enterprise-wide data engineering standards and cloud solutions. Leading technical initiatives and mentoring engineers to support data capabilities across the organization.
Data Engineer contributing to advanced analytics and machine learning solutions in aviation at Boeing. Collaborating within a data science team to produce industry-leading insights and build cloud-based tools.
Data Engineer designing and maintaining scalable data solutions in GCP and Snowflake environments. Collaborating with clients and stakeholders to ensure data quality and functionality.
Senior Data Engineer optimizing scalable data pipelines at Matrix for the Brazilian energy market. Involved in ETL processes and data architecture design with a collaborative team environment.
Implementation Engineer guiding customers in data architecture and integration solutions at MotherDuck. Collaborating with teams and leading technical discussions to ensure customer success.
Data Engineer/Senior Data Engineer developing scalable ETL/ELT pipelines and architecting data systems at Manulife. Collaborating with data professionals to ensure data quality and compliance.