Data Engineer developing core algorithms and back-end pipelines for cargo tracking at Kpler. Collaborating with data engineers and scientists to optimize system performance and data quality.
Responsibilities
Work alongside data engineers, data scientists, and product managers to develop and implement our core algorithms and back-end data pipelines based on project requirements and design specifications.
Help clients and internal users benefit from the highest cargo tracking data quality by adding new features and reviewing pipelines and integrations with our datastores.
Help to optimise system performance, maintain features, troubleshoot issues, and ensure high availability.
Demonstrate strong analytical and debugging skills with a proactive approach to learning.
Requirements
Have hands-on experience with **Python**. Experience with **Flask** or **SQLAlchemy** is a plus.
Solid **SQL** skills for querying and managing relational databases.
Knowledge of streaming and big data technologies (such as **Kafka or Spark**).
Comfortable working with **Git**, code reviews, and **Agile methodologies**.
Have an understanding of containerisation and orchestration tools (e.g., **Docker, Kubernetes**).
Are eager to learn new languages and technologies.
Have worked with **AWS** (or another cloud provider), using **Terraform**.
Have experience with Scala or other **JVM languages**.