Data Engineer at Betfair Romania Development responsible for building and maintaining scalable data pipelines. Collaborating with cross-functional teams to transform raw data into trusted datasets.
Responsibilities
Assist in building and maintaining batch and real-time data pipelines using modern data tools and cloud platforms.
Monitor internal dashboards and communication channels to help identify and escalate production issues.
Respond to support requests from data users related to data access, anomalies, and performance questions.
Support the resolution of data quality issues, working with senior engineers to identify root causes.
Help create and maintain documentation for pipelines, systems, and troubleshooting steps.
Collaborate with engineers, analysts, and technical project managers to ensure smooth data operations and successful delivery of features.
Ensure the reliability of data ingestion processes, respecting the governance principles defined by the organization.
Contribute to the quality and technological innovation of the product and the work environment.
Participate in team meetings and code reviews to learn engineering best practices and contribute to technical quality.
Share in-depth technical knowledge of Big Data and help train users in data solutions.
Promote data quality and monitoring across the information environment.
Requirements
A passion for working with data and solving technical challenges.
Solid foundation in SQL, experience with at least one programming language (e.g., Python, Java, Scala), and familiarity with data transformation workflows using dbt.
Familiarity with data warehouses and/or cloud platforms such as AWS (bonus if you've used S3, Redshift, or similar tools).
Experience with ETL/ELT processes and data ingestion techniques.
Experience with Databricks, including Delta Live Tables (DLT) for building reliable and maintainable data pipelines.
Hands-on knowledge of Apache Spark and/or Spark Streaming for processing large-scale batch and real-time data.
Exposure to data orchestration tools (e.g., Airflow) or monitoring tools (e.g., Datadog).
Good communication and collaboration skills.
Curiosity and a growth mindset – you’re eager to develop your skills and try new tools.
Understanding of dimensional modeling or data testing frameworks.
Familiarity with Git or CI/CD pipelines (e.g., GitHub Actions).
Data Engineer designing and optimizing Azure-based data platforms for enterprise analytics. Developing scalable data pipelines and enabling insights through Power BI and Azure Synapse Analytics.
Senior Software Engineer focused on ingestion pipeline at Fullstory. Engineering distributed systems for processing data at scale while collaborating with technical leaders.
Junior Data Engineer contributing to data solutions in home24's Martech team. Focus on data pipelines, analytical workflows, and machine learning model scaling with cross-functional collaboration.
Data Engineer at Onepoint developing cloud-native architectures and scalable data solutions. Collaborating on data processing pipelines and guiding clients on best practices.
Data Engineer at Onepoint contributing to client growth through cloud technologies. Involves developing data pipelines, auditing cloud configurations, and supporting data science practices.
Senior Data Engineer / Data Scientist developing AI-driven solutions at GFT. Focus on scalable data pipelines, AI/ML models, and LLM technologies while collaborating with UK banking clients.
AI & BI Data Engineer developing analytics solutions to enhance genomic platforms at Corteva Agriscience. Focused on data pipelines, AI/ML model operation, and decision-making analytics.
Senior Data Engineer optimizing data pipelines for AI solutions developed for clients. Working on data architecture and implementing machine learning models for scalable environments.
Data Engineer developing cloud migration and data solutions for retail at Public Group. Engaging in multiple projects with growth opportunities in a hybrid team environment.