Data Analytics Engineer responsible for designing scalable data pipelines at Ferryhopper. Focused on empowering data-driven decisions and optimizing data collection processes.
The role
As a Data Analytics Engineer, your main purpose is to leverage data to help the organization unlock value from multiple data sources.
You will contribute to the design of scalable data pipelines that support Ferryhopper’s growing data processing and analytics needs. Day to day, you will design and build data pipelines that collect, process, and store large volumes of data from various sources, producing clean, reliable datasets for reporting and analytics. You should be highly motivated and biased for action, with a problem-solving mindset, curiosity, and a passion for data and for enabling data-driven decisions.
Responsibilities
Contribute to the design, build, and maintenance of data pipelines and workflows to ensure reliable and high-quality data across Ferryhopper’s data sources, helping to develop clean ingestion and transformation processes.
Assist in orchestration and monitoring of data flows to guarantee data integrity and freshness across the BI ecosystem.
Support development and maintenance of data quality checks, including audits, validation logic, and anomaly detection to ensure consistent and reliable data.
Continuously optimize data collection and reporting workflows, automating manual processes and improving efficiency wherever possible.
Enable analytics and reporting by delivering well-structured, foundational datasets that support dashboards, reporting, and modeling across the business.
Perform light data exploration and data serving activities to support analytical and business teams in understanding and utilizing available data.
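As a flavor of the data quality work described above, here is a minimal sketch of batch-level audit checks (completeness, uniqueness, freshness) in plain Python. The function, field names, and thresholds are invented for illustration; in practice these checks would typically live in a framework such as dbt tests.

```python
from datetime import datetime, timedelta

def run_quality_checks(rows, key="booking_id", ts_field="loaded_at",
                       max_staleness=timedelta(hours=24)):
    """Return a dict of check name -> pass/fail for a batch of records.

    All names here are hypothetical examples, not Ferryhopper's schema.
    """
    keys = [r.get(key) for r in rows]
    completeness = all(k is not None for k in keys)        # no missing keys
    uniqueness = len(keys) == len(set(keys))               # no duplicate keys
    newest = max(r[ts_field] for r in rows)
    freshness = datetime.utcnow() - newest <= max_staleness  # data is recent
    return {"completeness": completeness,
            "uniqueness": uniqueness,
            "freshness": freshness}

batch = [
    {"booking_id": 1, "loaded_at": datetime.utcnow()},
    {"booking_id": 2, "loaded_at": datetime.utcnow() - timedelta(hours=1)},
]
print(run_quality_checks(batch))
```

A failing check would typically page the on-call engineer or block downstream models until the anomaly is investigated.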
Requirements
2-3 years of experience as a Data Engineer, Analytics Engineer, or in a similar data-focused role.
Hands-on experience building analytical models and data pipelines that support scalable reporting and analytics, using modern transformation frameworks such as dbt.
Hands-on experience with data ingestion tools, such as dlt, Airbyte, Google Datastream.
Strong programming skills in SQL and Python.
Familiarity with modern cloud data platforms such as AWS or Google Cloud, with a focus on data storage and compute resources.
Strong communication and collaboration skills.
Excellent problem-solving and analytical skills.
Ability to work independently, manage multiple projects and priorities simultaneously, and meet deadlines with a high degree of accuracy and attention to detail.
A lifelong learner who is curious, passionate about solving hard, ill-defined problems, comfortable taking initiative, and continuously seeking to improve their skills and understanding. Enjoys working both in a team and independently.
Nice to have:
Knowledge of coding best practices and software engineering principles, including version control, code review, testing, and documentation.
Familiarity with agile development methodologies, including scrum, Kanban, and agile project management tools.
Experience with orchestration and automation tools and frameworks (e.g., Dagster, Airflow) to build, schedule, monitor, and manage data workflows.
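The core idea behind orchestrators like Dagster or Airflow is running tasks in dependency order. The toy sketch below illustrates just that ordering with Python's standard-library `graphlib`; the task names are invented, and a real deployment would use the orchestrator's own DAG and scheduling APIs.

```python
# Toy illustration of dependency ordering, not a real Airflow/Dagster DAG.
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on (invented names).
dag = {
    "transform": {"ingest"},          # transform runs after ingestion
    "quality_checks": {"transform"},  # audits run on transformed data
    "publish_report": {"quality_checks"},
}

# static_order() yields tasks so every task follows its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # "ingest" first, "publish_report" last
```

An orchestrator adds scheduling, retries, and monitoring on top of this ordering, which is what keeps data integrity and freshness guarantees enforceable.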
Tech stack:
SQL, Python, BigQuery, dbt, Google Cloud, AWS, S3, Lambda Functions, Power BI, Looker Studio
Benefits
The health of our company and the success of our products is directly related to the health of our team and the work environment we create for ourselves. With this in mind, we strive to provide an inclusive and positive working environment. In this respect, we offer:
A competitive compensation package
Equipment of your choice
Training and educational budget throughout the year
Joining a fast-growing ambitious international team
Fun team events and a vibrant company culture
Flexible working policy
***Remote policy: For teams located in Athens, the policy is to visit the office a minimum of once per week.***
***There are six weeks per year in which you can work from anywhere without visiting the office.***