Data Engineer developing and maintaining scalable data pipelines for Kognia Sports Intelligence. Collaborating with engineers and data scientists to support data-driven decision making.
Responsibilities
Design, build, and support modern, scalable data pipelines using appropriate data processing frameworks, technologies, and platforms
Apply best practices for CI/CD, automation, testing, and monitoring of analytics pipelines
Collaborate with software engineers, researchers, data scientists, and stakeholders to understand what data is required and how best to make it available in our platform
Improve our cloud architecture and design new architectures as the need arises
Identify and address possibilities for improvement in areas such as speed of delivery and infrastructure cost reduction
Investigate new technologies and approaches as needed
Take part in an on-call rotation with your team members
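As a small illustration of the testing practice mentioned above, a pipeline step written as a pure function is straightforward to unit-test in CI. This is a hedged sketch only; the function name and data shape are hypothetical and not drawn from Kognia's actual codebase.

```python
def deduplicate_events(events: list[dict]) -> list[dict]:
    """Drop duplicate events by "id", keeping the first occurrence.

    A pure transform like this can be covered by a fast unit test
    that runs on every pipeline change in CI.
    """
    seen: set = set()
    out: list[dict] = []
    for event in events:
        if event["id"] not in seen:
            seen.add(event["id"])
            out.append(event)
    return out
```

Because the function has no I/O or external state, a CI job can assert its behavior on tiny in-memory fixtures without provisioning any infrastructure.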
Requirements
Minimum 1 year in a similar position or 3 years in other engineering roles with relevant responsibilities
Fluent with one or more high-level programming languages (Python preferred, but also Ruby, Java, Scala, Go, or similar)
Willing to work mostly in Python, with the possibility of other stacks as the team decides on a service-by-service basis
Experience working with SaaS production architectures in GCP (preferred) or AWS
Ability to adapt to a fast-paced, changing agile environment
Interest (if not experience) in DevOps technologies such as Kubernetes
Excellent team player with strong verbal and written communication skills
Comfortable working in English; we're an international team based in Barcelona, with English as our shared language.
Experience providing data and infrastructure for building and deploying ML models to production (preferred)
Experience working in multi-functional teams with end-to-end responsibility for product development and delivery within your mission (preferred)
Front-end experience in React (preferred)
Interested in being the glue between engineering and research (preferred)
Experience in data quality and governance (preferred)
Specific knowledge of GitLab CI/CD (preferred)
Knowledge of containerization, GitOps, and Linux (preferred)
Kubernetes experience in particular is a big plus (preferred)
Data Engineer building DUAL Personal Lines' strategic data platforms for a global insurance group. Providing technical expertise in data engineering and collaborating with internal teams on solution delivery.
Data Engineer role focused on creating and monitoring data pipelines at an innovative energy company. Collaborate with IT and business departments to ensure quality data availability in a hybrid work environment.
SQL Migration Data Engineer at Auxo Solutions focusing on Azure SQL/Fabric Lakehouse migrations and building data pipelines. Collaborating on technical designs and data governance for modernization initiatives.
Data Engineer developing cloud solutions and software tools on Microsoft Azure big data platform. Collaborating with various teams for data analysis and visualization in healthcare.
Boomi Integration Architect designing and leading integration solutions for data warehouses. Collaborating with cross-functional teams to implement scalable integration patterns using Boomi technologies.
Seeking a Boomi Integration Architect specializing in Data Warehouse and Master Data Hub implementations. Responsible for designing high-performance integration solutions across enterprise platforms.
Principal Data Engineer at Serko enhancing global travel tech through data-driven solutions. Collaborating across teams in Bengaluru to drive innovative engineering and best practices.
Data Engineer at Keyrus responsible for building and optimizing data pipelines for major projects. Contributing to data solutions and ensuring data quality in a growing team.
Data Architect designing and implementing scalable data architectures for Keyrus in Bordeaux. Leading client transitions and contributing to the tech ecosystem with innovative data solutions.