About the role

  • Data Engineer at Betfair Romania Development, responsible for building and maintaining scalable data pipelines and collaborating with cross-functional teams to transform raw data into trusted datasets.

Responsibilities

  • Assist in building and maintaining batch and real-time data pipelines using modern data tools and cloud platforms.
  • Monitor internal dashboards and communication channels to help identify and escalate production issues.
  • Respond to support requests from data users related to data access, anomalies, and performance questions.
  • Support the resolution of data quality issues, working with senior engineers to identify root causes.
  • Help create and maintain documentation for pipelines, systems, and troubleshooting steps.
  • Collaborate with engineers, analysts, and technical project managers to ensure smooth data operations and successful delivery of features.
  • Ensure the reliability of data ingestion processes in line with the governance principles defined by the organization.
  • Contribute to the quality and technological innovation of the product and the work environment.
  • Participate in team meetings and code reviews to learn engineering best practices and contribute to technical quality.
  • Share in-depth technical knowledge of Big Data and help train users in data solutions.
  • Promote data quality and monitoring across the data environment.

Requirements

  • A passion for working with data and solving technical challenges.
  • Solid foundation in SQL, experience with at least one programming language (e.g., Python, Java, Scala), and familiarity with data transformation workflows using dbt.
  • Familiarity with data warehouses and/or cloud platforms such as AWS (bonus if you've used S3, Redshift, or similar tools).
  • Experience with ETL/ELT processes and data ingestion techniques.
  • Experience with Databricks, including Delta Live Tables (DLT) for building reliable and maintainable data pipelines.
  • Hands-on knowledge of Apache Spark and/or Spark Streaming for processing large-scale batch and real-time data.
  • Exposure to data orchestration tools (e.g., Airflow) or monitoring tools (e.g., Datadog).
  • Good communication and collaboration skills.
  • Curiosity and a growth mindset – you’re eager to develop your skills and try new tools.
  • Understanding of dimensional modeling or data testing frameworks.
  • Familiarity with Git or CI/CD pipelines (e.g., GitHub Actions).

Benefits

  • Hybrid & remote working options
  • €1,000 per year for self-development
  • Company share scheme
  • 25 days of annual leave
  • 20 days per year to work abroad
  • 5 personal days/year
  • Flexible benefits: travel, sports, hobbies
  • Extended health, dental and travel insurance
  • Customized well-being programmes
  • Career growth sessions
  • Thousands of online courses through Udemy
  • A variety of engaging office events

Job title

Data Engineer

Job type

Experience level

Mid level / Senior

Salary

Not specified

Degree requirement

Bachelor's Degree

Location requirements
