Data Architect defining, evolving, and governing data models for capital markets trading and finance at a leading global firm. Engaging with senior stakeholders in a hybrid role based in London.
Responsibilities
Define, evolve, and govern enterprise canonical data models for capital markets across the full trade lifecycle (front-to-back): order, execution, trade capture, confirmation, clearing, settlement, lifecycle events, collateral/margin, accounting impacts.
Model complex products and lifecycle state (e.g., rates/FX/credit/equity derivatives, securities financing), with clear semantics, identifiers, lineage, and versioning.
Apply and map industry standards:
FpML: interpret existing trade representation/messaging and define mappings into internal canonical schemas
ISDA CDM: model lifecycle events, states, workflows, and products; define CDM-to-canonical and canonical-to-platform mapping specifications
Lead modelling for ingestion and harmonisation of:
Market data (curves, surfaces, fixings, reference data, identifiers)
Trade data (allocations, amendments, novations, compressions, terminations, resets)
Order data (order lifecycle, fills, venues, timestamps, audit trail)
Partner with Front Office, Finance, Market Risk, and Credit Risk to ensure models support valuation inputs, sensitivities, exposure/netting/collateral, P&L explain, reconciliations, and reporting needs.
Translate logical models into implementation-ready artefacts: schemas, API contracts, event/topic models, validation rules, transformation specifications, and data-quality rules.
Establish model governance practices: naming standards, business glossary, metadata, stewardship model, change control, documentation, and versioning.
Provide architectural input into target-state data platforms and integration patterns (e.g., event-driven, lakehouse/warehouse), and mentor engineers/analysts on model adoption.
Requirements
Proven data modelling experience in capital markets, ideally within top-tier banks or equivalent institutional environments.
Strong front-to-back understanding with FpML and ISDA CDM (interpretation, application to real workflows, and mapping into internal canonical models).
Strong knowledge of market data, trade data, and order data domains and their interrelationships.
Background as an object-oriented software developer (e.g., Java/C#/C++), able to bridge modelling and engineering implementation.
Certifications such as TOGAF or DAMA/CDMP, and/or cloud certifications (Azure/AWS/GCP or equivalent).
Experience implementing models into streaming and data platform ecosystems (e.g., Kafka/event hubs, lakehouse/warehouse) and establishing enterprise governance practices.
Exposure to regulatory/reporting data domains and cross-system reconciliation frameworks.
Senior Data Engineer at Keyrus leading the design, development, and delivery of scalable data platforms. Collaborating with teams to translate requirements into production-grade solutions and mentoring engineers.
Senior Data Engineer for global payments platform designing ETL pipelines and data models. Collaborating across teams to tackle complex data challenges in an innovative fintech environment.
Data Warehouse Modelling Engineer designing and maintaining data models using Data Vault 2.0 for the iGaming industry. Collaborating with stakeholders and optimizing data models in a hybrid work environment.
Senior Data Engineer driving impactful data solutions for the climate logistics startup HIVED's core data platform. Collaborating with cross-functional squads to enhance analytics and delivery.
Data Engineer developing and maintaining CRE forecasting infrastructure for Cushman & Wakefield. Collaborating with senior economists and technical teams to ensure high-quality data solutions.
Data Engineer at PwC, engaging with Azure cloud services to enhance data handling and integrity. Responsibilities include pipeline optimizations, documentation, and collaboration with stakeholders.