Data Analytics Engineer responsible for designing scalable data pipelines at Ferryhopper. Focused on empowering data-driven decisions and optimizing data collection processes.
The role
As a Data Analytics Engineer, your main purpose is to leverage data to empower the organization to unlock value from multiple data sources.
You will contribute to the design of scalable data pipelines to support Ferryhopper’s growing data processing and analytics needs. Your day-to-day tasks range from designing and building data pipelines to collecting, processing, and storing large volumes of data from various sources, creating clean, reliable datasets for reporting and analytics. You’ll need to be highly motivated, have a bias for action and a problem-solving mindset, and be curious and passionate about data and enabling data-driven decisions.
Responsibilities
Contribute to the design, build, and maintenance of data pipelines and workflows to ensure reliable and high-quality data across Ferryhopper’s data sources, helping to develop clean ingestion and transformation processes.
Assist in orchestration and monitoring of data flows to guarantee data integrity and freshness across the BI ecosystem.
Support development and maintenance of data quality checks, including audits, validation logic, and anomaly detection to ensure consistent and reliable data.
Continuously optimize data collection and reporting workflows, automating manual processes and improving efficiency wherever possible.
Enable analytics and reporting by delivering well-structured, foundational datasets that support dashboards, reporting, and modeling across the business.
Perform light data exploration and data serving activities to support analytical and business teams in understanding and utilizing available data.
Requirements
2-3 years of experience as a Data Engineer, Analytics Engineer, or in a similar data-focused role.
Hands-on experience building analytical models and data pipelines that support scalable reporting and analytics, using modern transformation frameworks such as dbt.
Hands-on experience with data ingestion tools such as dlt, Airbyte, or Google Datastream.
Strong programming skills in SQL and Python.
Familiarity with modern cloud data platforms such as AWS or Google Cloud, with a focus on data storage and compute resources.
Strong communication and collaboration skills.
Excellent problem-solving and analytical skills.
Ability to work independently, manage multiple projects and priorities simultaneously, and meet deadlines with a high degree of accuracy and attention to detail.
A lifelong learner who is curious, passionate about solving hard, ill-defined problems, comfortable taking initiative, and continuously seeking to improve their skills and understanding. Enjoys working in a team environment as well as independently.
Nice to have:
Knowledge of coding best practices and software engineering principles, including version control, code review, testing, and documentation.
Familiarity with agile development methodologies, including Scrum, Kanban, and agile project management tools.
Experience with orchestration and automation tools and frameworks (e.g., Dagster, Airflow) to build, schedule, monitor, and manage data workflows.
Tech stack:
SQL, Python, BigQuery, dbt, Google Cloud, AWS, S3, Lambda Functions, Power BI, Looker Studio
Benefits
The health of our company and the success of our products is directly related to the health of our team and the work environment we create for ourselves. With this in mind, we strive to provide an inclusive and positive working environment. In this respect, we offer:
A competitive compensation package
Equipment of your choice
Training and educational budget throughout the year
Joining a fast-growing ambitious international team
Fun team events and a vibrant company culture
Flexible working policy
Remote policy: For teams located in Athens, the policy is to visit the office a minimum of once per week. There are six weeks per year in which you can work from anywhere without visiting the office.