Principal Data Platform Engineer (Hybrid)

About the role

  • As a Principal Data Platform Engineer at Simple Machines, you will design and build cloud-native data platforms, leading technical direction and making high-impact architectural decisions to solve complex data challenges.

Responsibilities

  • Own the end-to-end architecture of modern, cloud-native data platforms
  • Design scalable data ecosystems using **data mesh, data products, and data contracts** (see the contract sketch after this list)
  • Make high-impact architectural decisions across ingestion, storage, processing, and access layers
  • Ensure platforms are secure, compliant, and production-grade by design
  • Design and deliver cloud-native data platforms using **Databricks, Snowflake, AWS, and GCP**
  • Integrate deeply with client systems to enable scalable, consumer-oriented data access
  • Build and optimise **batch and real-time pipelines** (see the streaming sketch after this list)
  • Work with streaming and event-driven tech such as **Kafka, Flink, Kinesis, Pub/Sub**
  • Orchestrate workflows using **Airflow, Dataflow, Glue**
  • Process and transform large datasets using **Spark and Flink**
  • Work across relational, NoSQL, and analytical stores (Postgres, BigQuery, Snowflake, Cassandra, MongoDB)
  • Optimise storage formats and access patterns (Parquet, Delta, ORC, Avro)
  • Implement secure, compliant data solutions with **security by design**
  • Embed governance without sacrificing developer velocity
  • Translate business needs into pragmatic engineering decisions
  • Act as a trusted technical advisor, not just an order taker
  • Set engineering standards, patterns, and best practices across teams
  • Review designs and code, providing clear technical direction and mentorship
  • Raise the bar on data quality, testing, observability, and operational excellence
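
To make the data-contract responsibility concrete, here is a minimal sketch of what a contract for a hypothetical `orders` data product could look like, expressed as a pydantic (v2) model; the field names, types, and the `OrderEvent` name are illustrative assumptions, not part of the role description.

```python
# Minimal data-contract sketch for a hypothetical "orders" data product.
# All names and fields are illustrative assumptions.
from datetime import datetime
from decimal import Decimal

from pydantic import BaseModel, Field


class OrderEvent(BaseModel):
    """Schema that producers agree to publish and consumers can rely on."""

    order_id: str = Field(min_length=1)
    customer_id: str = Field(min_length=1)
    amount: Decimal = Field(ge=0)                  # non-negative order total
    currency: str = Field(pattern=r"^[A-Z]{3}$")   # ISO 4217 currency code
    created_at: datetime


# Producers validate records before publishing, so a contract breach surfaces
# at the boundary instead of breaking downstream consumers.
event = OrderEvent(
    order_id="o-123",
    customer_id="c-456",
    amount=Decimal("19.99"),
    currency="AUD",
    created_at=datetime(2024, 1, 1, 12, 0),
)
print(event.model_dump_json())
```

Versioning the contract alongside the data product lets producers and consumers evolve independently, with breaking changes caught at the publishing boundary.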
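
Similarly, a minimal PySpark Structured Streaming sketch of the kind of batch and real-time pipeline described above: reading the hypothetical `orders` events from Kafka and appending them to a Delta table. The broker address, topic, and paths are placeholders, the job assumes a Delta-enabled Spark runtime (for example Databricks), and a production pipeline would add authentication, schema evolution, and dead-letter handling.

```python
# Streaming sketch: Kafka -> parse JSON against the contract schema -> Delta.
# Broker, topic, and paths are placeholder assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (
    DecimalType, StringType, StructField, StructType, TimestampType,
)

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

order_schema = StructType([
    StructField("order_id", StringType(), nullable=False),
    StructField("customer_id", StringType(), nullable=False),
    StructField("amount", DecimalType(18, 2), nullable=False),
    StructField("currency", StringType(), nullable=False),
    StructField("created_at", TimestampType(), nullable=False),
])

# Kafka source: each record value is a JSON-encoded order event.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
)

orders = (
    raw.select(F.from_json(F.col("value").cast("string"), order_schema).alias("order"))
    .select("order.*")
)

# Append to a Delta table; the checkpoint lets the stream resume safely after restarts.
query = (
    orders.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/orders")
    .outputMode("append")
    .start("/mnt/lake/bronze/orders")
)
query.awaitTermination()
```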

Requirements

  • Strong **Python and SQL**
  • Deep experience with **Spark** and modern data platforms (Databricks / Snowflake)
  • Solid grasp of cloud data services (AWS or GCP)
  • Demonstrated ownership of large-scale data platform architectures
  • Strong data modelling skills and architectural decision-making ability
  • Comfortable balancing trade-offs between performance, cost, and complexity
  • Built and operated **large-scale data pipelines** in production
  • Comfortable with multiple storage technologies and formats
  • Infrastructure-as-code experience with **Terraform or Pulumi** (see the Pulumi sketch after this list)
  • CI/CD pipelines using tools like **GitHub Actions, ArgoCD**
  • Data testing and quality frameworks such as **dbt, Great Expectations, Soda** (see the quality-check sketch after this list)
  • Experience in consulting or professional services environments
  • Strong consulting instincts — able to challenge assumptions and guide clients toward better outcomes
  • Comfortable mentoring senior engineers and influencing technical culture
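
As a sketch of the infrastructure-as-code expectation, here is a minimal Pulumi (Python) program that declares a versioned S3 landing bucket; the resource name and the choice of AWS are assumptions for illustration, and Terraform would express the same idea in HCL.

```python
# Minimal Pulumi sketch: a versioned S3 bucket for raw landing data.
# Resource names and the AWS/S3 choice are illustrative assumptions.
import pulumi
import pulumi_aws as aws

raw_bucket = aws.s3.Bucket(
    "raw-landing",
    versioning=aws.s3.BucketVersioningArgs(enabled=True),
)

# Export the bucket name so downstream stacks (ingestion jobs, catalogs) can reference it.
pulumi.export("raw_bucket_name", raw_bucket.id)
```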
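
And a hand-rolled sketch of the kind of assertions that dbt tests, Great Expectations, or Soda formalise (this is not those frameworks' APIs): simple not-null and uniqueness checks run with PySpark against the hypothetical Delta table from the earlier streaming sketch.

```python
# Hand-rolled data quality checks, illustrating what dedicated frameworks formalise.
# The table path is an assumption carried over from the streaming sketch.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-quality").getOrCreate()
orders = spark.read.format("delta").load("/mnt/lake/bronze/orders")

total = orders.count()
null_ids = orders.filter(F.col("order_id").isNull()).count()
distinct_ids = orders.select("order_id").distinct().count()

# Fail the run loudly if the contract's basic guarantees are violated.
assert null_ids == 0, f"{null_ids} rows with null order_id"
assert distinct_ids == total, f"{total - distinct_ids} duplicate order_id values"
print(f"quality checks passed for {total:,} rows")
```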

Benefits

  • You’ll work on **interesting, high-impact problems**
  • You’ll build **modern platforms**, not maintain legacy mess
  • You’ll be surrounded by senior engineers who actually know their craft
  • You’ll have autonomy, influence, and room to grow

Job title

Principal Data Platform Engineer

Experience level

Lead

Salary

Not specified

Degree requirement

Bachelor's Degree

Location requirements

Hybrid