About the role

  • Data Engineer designing and maintaining scalable ETL pipelines at Satori Analytics. Collaborating with teams to deliver high-quality analytics solutions across various industries.

Responsibilities

  • **What Your Day Might Look Like:**
  • Independently design and maintain scalable ETL pipelines within a collaborative project team, delivering clean, analytics- and AI-ready data.
  • Work with SQL, Python, PySpark and tools like MS Fabric, Azure Data Factory, Databricks, Snowflake to develop, optimize, and automate data processes.
  • Design robust ETL processes, scalable data models, and optimized design patterns for analytics and BI workloads.
  • Ensure data quality and data governance through automated checks, monitoring, source-to-target mapping, data lineage, and continuous improvements.
  • Collaborate with cross-functional teams to understand data requirements and business semantics, and deliver high-quality solutions.
  • Troubleshoot, optimize, and support production pipelines to keep data flowing smoothly.
  • Use Git and Agile practices to work effectively in collaborative, iterative projects.

Requirements

  • **Your Superpowers 🚀**
  • BSc or MSc in Computer Science, Engineering, or similar.
  • Strong SQL, Python and/or PySpark skills.
  • Solid professional experience developing ETL pipelines (e.g. Azure Data Factory, Databricks, etc.) and modern data warehouses (e.g. MS Fabric or Databricks Delta Lakehouse, Snowflake, etc.).
  • Solid professional experience working with relational and NoSQL databases and systems (MS SQL Server, PostgreSQL, MongoDB, etc.).
  • Strong understanding of data modelling and design patterns (star schema, data vault, slowly changing dimensions (SCD)).
  • Basic knowledge of cloud platforms (Azure, AWS, or GCP).
  • Basic knowledge of visualization tools (Power BI, Tableau, Looker, etc.).
  • Understanding of Agile practices and version control systems (GitHub, Azure DevOps).
  • Strong problem-solving skills, eagerness to learn, a collaborative spirit, and customer-facing skills.
  • Fluent in English, both written and spoken.
  • **Bonus Points For:**
  • 3+ years’ experience in hands-on data engineering.
  • Understanding of AI concepts and architectures.
  • Experience with enterprise platforms like Salesforce, SAP or Entersoft.
  • Familiarity with ETL/ELT orchestration and low-code tools such as Airflow, dbt, Matillion, Fivetran, etc.
  • Advanced knowledge of Power BI, Tableau, Qlik, etc.
  • Exposure to Java or Scala, and OO/functional programming concepts.

Benefits

  • **Perks on Perks:**
  • Competitive salary and hybrid work model – come hang out in our Athens office or work remotely from anywhere in the European Economic Area (EU, Switzerland, etc.) or the UK (up to 6 weeks per year).
  • Training budget to level up your skills from the top tech partners in the market (Microsoft, AWS, Salesforce, Databricks etc.) – whether it’s certifications or courses, we’ve got you covered.
  • Private insurance, top-tier tech gear, and the chance to work with a stellar crew.

Job title

Data Engineer

Job type

Experience level

Mid level / Senior

Salary

Not specified

Degree requirement

Bachelor's Degree

Location requirements
