Data Engineer building and maintaining data pipelines for Farfetch’s data platform. Collaborating with the Data team to improve data reliability and architecture in Porto.
Responsibilities
Contribute to the hiring and training of engineers within the managed team
Ensure the managed team collaborates with other engineering teams (inside and outside the domain) and adheres to the defined global engineering best practices
Design and build scalable & reliable data pipelines (ETLs) for our data platform
Constantly evolve data models & schema design of our Data Warehouse to support self-service needs
Work cross-functionally with various teams, creating solutions that handle large volumes of data
Work with the team to set and maintain standards and development practices
Be a keen advocate of quality and continuous improvement
Requirements
You have a solid technical background and professional experience building and maintaining data pipelines with a custom or commercial ETL tool (e.g. SSIS, Talend, Informatica); Airflow is a plus
You have experience working in a Data Warehouse environment with varied forms of data infrastructure, including relational databases, Hadoop, and column stores
Expert in creating and evolving dimensional data models & schema designs to improve data accessibility and enable intuitive analytics
Basic experience and knowledge of cloud environments (e.g. AWS, GCP, Azure)
Expert in SQL
You are proficient in one of the following programming languages: C#, Java, Python
You have 2+ years' experience working with a BI reporting tool (e.g. Tableau, QlikView, Power BI, Looker)
Proficient in applying continuous delivery principles: version control, unit and automated testing
You are fluent in English, both written and spoken
You have good analytical and problem-solving skills, the ability to work in a fast-moving operational environment, and an enthusiastic, positive attitude
Benefits
Health insurance for the whole family, flexible working environment and well-being support and tools
Extra days off, a sabbatical program, and days for you to give back to the community
Intermediate Data Engineer designing and building data pipelines for travel industry data management. Collaborating across teams to ensure reliable data for analytics and reporting.
Data Engineer managing and organizing datasets for AI models at Walaris, developing AI-driven autonomous systems for defense and security applications.
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.
IT Data Engineering Co-Op at BlueRock Therapeutics supporting the development of scientific data systems. Collaborating on data workflows and foundational AWS data engineering tasks.
Data Engineer I building and operationalizing complex data solutions for Travelers' analytics using Databricks. Collaborating within teams to educate end users and support data governance.
Data Engineer shaping modern data architecture to drive golf’s digital transformation. Collaborating with teams to enhance data pipelines and insights for customer engagement and revenue growth.
Staff Data Engineer overseeing complex data systems for CITY Furniture. Responsible for architecting and optimizing data ecosystems in a hybrid work environment.