Data Analytics Engineer responsible for designing scalable data pipelines at Ferryhopper. Focused on empowering data-driven decisions and optimizing data collection processes.
The role
As a Data Analytics Engineer, your main purpose is to leverage data to empower the organization to unlock value from multiple data sources.
You will contribute to the design of scalable data pipelines that support Ferryhopper’s growing data processing and analytics needs. Your day-to-day tasks range from designing and building data pipelines to collecting, processing, and storing large volumes of data from various sources, producing clean, reliable datasets for reporting and analytics. You should be highly motivated and biased for action, with a problem-solving mindset, curiosity, and a passion for data and for enabling data-driven decisions.
Responsibilities
Contribute to the design, build, and maintenance of data pipelines and workflows to ensure reliable and high-quality data across Ferryhopper’s data sources, helping to develop clean ingestion and transformation processes.
Assist in orchestration and monitoring of data flows to guarantee data integrity and freshness across the BI ecosystem.
Support development and maintenance of data quality checks, including audits, validation logic, and anomaly detection to ensure consistent and reliable data.
Continuously optimize data collection and reporting workflows, automating manual processes and improving efficiency wherever possible.
Enable analytics and reporting by delivering well-structured, foundational datasets that support dashboards, reporting, and modeling across the business.
Perform light data exploration and data serving activities to support analytical and business teams in understanding and utilizing available data.
Requirements
2-3 years of experience as a Data Engineer, Analytics Engineer, or in a similar data-focused role.
Hands-on experience building analytical models and data pipelines that support scalable reporting and analytics, using modern transformation frameworks such as dbt.
Hands-on experience with data ingestion tools such as dlt, Airbyte, or Google Datastream.
Strong programming skills in SQL and Python.
Familiarity with modern cloud data platforms such as AWS or Google Cloud, with a focus on data storage and compute resources.
Strong communication and collaboration skills.
Excellent problem-solving and analytical skills.
Ability to work independently, manage multiple projects and priorities simultaneously, and meet deadlines with a high degree of accuracy and attention to detail.
A lifelong learner who is curious, passionate about solving hard, ill-defined problems, comfortable taking initiative, and continuously seeking to improve their skills and understanding. Enjoys working both in a team and independently.
Nice to have:
Knowledge of coding best practices and software engineering principles, including version control, code review, testing, and documentation.
Familiarity with agile development methodologies, including scrum, Kanban, and agile project management tools.
Experience with orchestration and automation tools and frameworks (e.g., Dagster, Airflow) to schedule, monitor, build, and manage data workflows.
Tech stack:
SQL, Python, BigQuery, dbt, Google Cloud, AWS, S3, Lambda Functions, Power BI, Looker Studio
Benefits
The health of our company and the success of our products is directly related to the health of our team and the work environment we create for ourselves. With this in mind, we strive to provide an inclusive and positive working environment. In this respect, we offer:
A competitive compensation package
Equipment of your choice
Training and educational budget throughout the year
Joining a fast-growing ambitious international team
Fun team events and a vibrant company culture
Flexible working policy
***Remote policy: For teams located in Athens, the policy is to visit the office a minimum of once per week.***
***There are six weeks per year in which you can work from anywhere without visiting the office.***