Graph Data Engineer for Financial Crime platform at TymeX, designing graph data models and pipelines. Collaborating with teams to uncover fraud and optimize data performance.
Responsibilities
Design and implement **graph data models** that represent complex relationships between customers, organizations, transactions, devices, behaviour, and events.
Build **graph data pipelines** to extract, transform, and load (ETL) data from multiple internal and external sources.
Develop efficient **queries and Graph Data Science algorithms** to support entity resolution, link analysis, and investigative insights.
Work with graph visualization tools to build intuitive investigation interfaces for investigators and analysts.
Collaborate with **Financial Crime/Fraud analysts** to translate investigation use cases into graph-driven solutions.
Optimize data performance and ensure data quality across graph structures.
Partner with software engineers and data scientists to integrate graph data into wider Financial Crime/Fraud workflows and detection logic.
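The link-analysis work described above can be sketched in plain Python: given an edge list connecting customers and the devices they use, a breadth-first search surfaces the chain of shared devices linking two accounts. This is a minimal, self-contained illustration with hypothetical node names, not TymeX's actual data model or stack (which the posting indicates runs on a graph database such as Neo4j).

```python
from collections import deque

# Toy graph: customers and devices as nodes, undirected "used" edges.
# All identifiers below are hypothetical illustration data.
EDGES = [
    ("cust:alice", "device:phone-1"),
    ("cust:bob",   "device:phone-1"),   # shared device links Alice and Bob
    ("cust:bob",   "device:laptop-2"),
    ("cust:carol", "device:laptop-2"),
    ("cust:dave",  "device:tablet-3"),  # not connected to the others
]

def build_adjacency(edges):
    """Build an undirected adjacency list from an edge list."""
    adj = {}
    for a, b in edges:
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    return adj

def link_path(adj, start, goal):
    """Return the shortest path between two entities via BFS, or None."""
    if start not in adj or goal not in adj:
        return None
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adj[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

adj = build_adjacency(EDGES)
# Alice reaches Carol through two shared devices via Bob:
print(link_path(adj, "cust:alice", "cust:carol"))
```

In a production graph database the same question is a one-line path query (e.g. `shortestPath` in Cypher); the BFS here only shows the shape of the problem investigators care about.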
Requirements
**Must-Have:**
3+ years of experience in **Data Engineering, Graph Data Modeling**, or related fields.
Hands-on experience with **graph databases** (e.g., Neo4j, TigerGraph, JanusGraph, Neptune, or similar).
Proficiency in **Python or PySpark** for data manipulation and integration on Databricks.
Solid understanding of **data modeling, ETL, and query languages** (Cypher preferred).
Strong analytical mindset and problem-solving skills.
Good English communication skills (written and verbal).
Ability to work effectively in **cross-functional, international teams**.
**Nice to Have:**
Experience with graph visualization tools (e.g., Bloom, Linkurious).
Knowledge of **Entity Resolution, Fraud Detection, or AML / KYC** systems.
Experience integrating graph databases with **case management or Financial Crime/Fraud workflows**.
Familiarity with **cloud-based data platforms** (AWS).
Benefits
Join a **mission-driven product** tackling real-world financial crime challenges through data and technology.
Work with **modern graph technologies** and cutting-edge data platforms.
Collaborate with **global teams** across Vietnam, Philippines, Singapore, and South Africa.
Hybrid working model, flexible hours, and a diverse, inclusive environment.
Competitive salary and professional growth opportunities.
Data Engineer/Analyst maintaining and improving data infrastructure for Braiins. Collaborating with technical and business teams to ensure reliable data flows and insights.
Medior Data Engineer handling Azure migrations for a major urban mobility client. Focused on data pipeline development and ensuring platform reliability with cutting-edge technologies.
Developing ML and computer vision solutions for a cutting-edge autonomous vehicle dataset pipeline at Mobileye. Collaborating across teams on data curation and advanced perception algorithms.
Data Migration Lead in a hybrid role managing data migration for a major transformation programme in the media sector. Collaborating with various teams to ensure data integrity and successful migration.
Consultant ML & DataOps at Smile integrating data science projects for major clients. Designing MLOps solutions and enhancing data governance in a collaborative environment.
Data Engineer developing and maintaining data pipelines for Coolbet’s analytical services. Working within an Agile framework to ensure data reliability and efficiency.
API Data Engineer developing innovative data-driven solutions and advancing data architecture for AI Control Tower. Building and integrating APIs and data pipelines to support organizational needs.
Journeyman Data Architect supporting Leidos' enterprise data and analytics program for the Department of War. Collaborating on solutions for data architecture, cloud environments, and governance.
Senior Software Engineer developing backend services and data infrastructure for integrated products at Booz Allen. Collaborating with a small elite team to deliver reliable and scalable services.
AWS Streaming Data Engineer developing software and systems in a fast, agile environment. Utilizing experience with real-time data ingestion and processing systems across distributed environments.