Data Engineer developing and maintaining scalable data pipelines for Kognia Sports Intelligence. Collaborating with engineers and data scientists to support data-driven decision making.
Responsibilities
Design, build, and support modern, scalable data pipelines using appropriate data processing frameworks, technologies, and platforms
Use best practices around CI/CD, automation, testing, and monitoring of analytics pipelines
Collaborate with software engineers, researchers, data scientists, and stakeholders to understand what data is required and how best to make it available in our platform
Improve our cloud architecture and design new architectures as the need arises
Identify and address opportunities for improvement in areas such as delivery speed and infrastructure cost
Investigate new technologies and approaches as needed
Take part in an on-call rotation with your team members
Requirements
Minimum 1 year in a similar position or 3 years in other engineering roles with relevant responsibilities
Fluent in one or more high-level programming languages (Python preferred, but also Ruby, Java, Scala, Go, or similar)
Willing to work mostly in Python, with the possibility of other stacks as the team decides on a service-by-service basis
Experience working with SaaS production architectures in GCP (preferred) or AWS
Ability to adapt to a fast-paced, changing agile environment
Interest (if not experience) in DevOps technologies such as Kubernetes
Excellent team player with strong verbal and written communication skills
Comfortable working in English; we're an international team based in Barcelona, with English as our shared language
Experience providing data and infrastructure for building and deploying ML models to production (preferred)
Experience working in multi-functional teams with end-to-end responsibility for product development and delivery within your mission (preferred)
Front-end experience in React (preferred)
Interested in being the glue between engineering and research (preferred)
Experience in data quality and governance (preferred)
Specific knowledge of GitLab CI/CD (preferred)
Knowledge of containerization, GitOps, and Linux (preferred)
Kubernetes experience is an especially strong plus (preferred)
Data Architect designing and implementing data architectures supporting analytics and ML for federal clients. Collaborating with teams to translate mission needs into robust data solutions.
IT Data Engineer developing data pipelines and integrations for Scanfil Group's global IT organization. Collaborating across teams to enhance data solutions and reporting capabilities.
Data Engineer developing Azure data solutions at PwC New Zealand. Responsibilities include data quality monitoring, pipeline development, and collaboration with stakeholders in a supportive environment.
Senior Data Engineer designing and implementing the Enterprise Data Platform at Stellix. Focusing on analytics and insights with a growth path to Principal Data Engineer or Data Architect.
R&D Data Engineer at DXC, transforming complex data into digital assets for global analytics and Smart Lab solutions. Collaborating on ELN and LIMS tools for enhanced data management.
Data Engineer role focusing on data pipelines and processing at 42dot, a mobility AI company. Responsibilities include data collection, schema management, and pipeline monitoring.
Senior Data Engineer at a mobility AI company designing large-scale data processing pipelines. Leading technical decisions and mentoring junior engineers in data architecture.
Senior Data Engineer at Booz Allen building advanced tech solutions for mission-driven projects. Applying data engineering practices, pipelines, and platforms to deliver impactful data insights.
Senior Software Engineer contributing to Workday's AI/MLOps cloud ops platform. Involves data ingestion, computation, and generation of curated data sets with modern technologies.