Data Engineer building and maintaining scalable data pipelines for Flowcode using modern technologies, collaborating with cross-functional teams to keep data flowing efficiently and infrastructure optimized.
Responsibilities
Design, develop, and maintain data pipelines for ingesting, processing, and transforming large datasets using Python.
Leverage Snowflake to build and manage robust data warehouses, ensuring efficient storage and retrieval of data.
Build, test, and maintain modular analytics models using dbt to ensure documented, versioned, and reliable transformations in Snowflake.
Implement ETL workflows using Python, Fivetran, and AWS DMS, ensuring data integrity, quality, and timely delivery.
Develop and integrate APIs to facilitate seamless data exchange between internal systems and external data sources.
Implement data ingestion solutions from streaming platforms, including Kafka, to handle real-time data processing (a consumer sketch follows this list).
Use Apache Airflow to schedule and monitor data workflows, ensuring consistent and reliable pipeline execution (a DAG sketch follows this list).
Deploy and manage applications using Docker and Kubernetes to create scalable, containerized solutions.
Continuously monitor, optimize, and troubleshoot data processes and infrastructure for performance improvements and reliability.
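To make the orchestration duty concrete, here is a minimal sketch of an Airflow DAG that schedules a daily extract followed by a dbt run against Snowflake. The DAG id, task names, schedule, and project path are hypothetical placeholders, not Flowcode's actual pipeline.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator


    def extract_orders(**context):
        # Hypothetical extract step; in practice this might trigger a
        # Fivetran sync or an AWS DMS task rather than run inline.
        print("extracting orders for", context["ds"])


    with DAG(
        dag_id="orders_daily",          # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",              # Airflow 2.4+ style schedule argument
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_orders)

        # Rebuild dbt models in Snowflake once the raw data has landed.
        transform = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt",  # hypothetical path
        )

        extract >> transform

Retries and a fixed schedule are what deliver the consistent, reliable execution called for above; Airflow's UI covers the monitoring half of the duty.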
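Similarly, the streaming-ingestion duty might look like the following minimal consumer loop, sketched with the kafka-python client; the topic, broker address, and consumer group are placeholders.

    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "events",                              # hypothetical topic
        bootstrap_servers=["localhost:9092"],  # placeholder broker
        group_id="ingest-demo",                # placeholder consumer group
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    # Each message arrives as deserialized JSON; a real pipeline would batch
    # these records and load them to Snowflake instead of printing them.
    for message in consumer:
        print(message.value)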
Requirements
3+ years of experience in data engineering or software development.
Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field, or equivalent practical experience in data engineering.
Proficiency in Python.
Hands-on experience with Snowflake, Fivetran, dbt, and AWS DMS.
Experience with RESTful/GraphQL APIs and Kafka.
Working knowledge of Apache Airflow, Docker, and Kubernetes.
Strong problem-solving abilities, with attention to detail and a focus on data quality.
Proven experience working collaboratively in agile, cross-functional teams.