About the role

  • Design, structure and optimize a modern data platform primarily based on Databricks
  • Ensure the platform’s scalability, performance, security and governance
  • Support teams in adopting Databricks tools and DataOps best practices
  • Integrate and optimize interactions between Databricks and Snowflake
  • Participate in technical discussions and international workshops (in English)
  • Design and document the overall data architecture around Databricks (Delta Lake, Unity Catalog, MLflow, etc.)
  • Define data models, pipelines and integration strategies (batch & streaming)
  • Oversee development best practices, automation and CI/CD
  • Collaborate with Cloud teams (Azure / AWS / GCP depending on context)
  • Communicate technical information in English with global teams
  • Produce architecture deliverables, documentation and knowledge transfer materials

Requirements

  • Databricks: excellent command (Spark, PySpark, Delta Lake, notebooks, Unity Catalog, orchestration)
  • Snowflake: strong knowledge of modeling, ingestion and cost management
  • Cloud: expertise in one of the major providers (Azure, AWS or GCP)
  • Languages: Python, SQL, Spark
  • DataOps / CI-CD: Git, Azure DevOps, Terraform or equivalents
  • Architecture: principles of Data Lakehouse, data governance, security, performance and scalability
  • English: professional proficiency required (meetings, documentation, international collaboration)
  • Soft skills: technical leadership and a broad vision of the data ecosystem
  • Excellent communication skills (in English and French)

Benefits

  • Hybrid working: 2 remote days per week; ideally 3 days on-site in Bordeaux

Job title

Data Architect

Job type

Experience level

Mid level / Senior

Salary

€650 per day

Degree requirement

No Education Requirement

Location requirements

Bordeaux, France (hybrid)