Senior KDB Developer at GFT designing and deploying time-series systems across markets. Collaborating with Trading, Quant, and Platform teams in a fast-paced environment.
Responsibilities
Drive the design and development of real-time market data and analytics services on KDB+/q (GW/RDB/HDB/tickerplant, pub/sub, CEP)
Own technical delivery: requirements, solution design, coding, testing, deployment, and production support in low-latency environments
Collaborate with Traders, Quants, and Technology to deliver high-performance analytics (asof joins, windowed analytics, intraday aggregations)
Optimize for latency and throughput (IPC, memory layout, partitioning, attributes, OS tuning, NUMA, hugepages)
Implement and harden ETL pipelines for tick capture and reference data enrichment; integrate with Kafka and streaming services where applicable
Contribute to coding standards, code reviews, and test automation for q/KDB+ (including PyKX integration points)
Ensure observability (metrics, tracing, logging), production readiness, and on-call excellence
Risk & Compliance: appropriately assess risk in decisions, protect client reputation and assets, comply with applicable laws, regulations, and policies, apply sound ethical judgment, and escalate issues transparently
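The asof joins mentioned above pair each trade with the most recent quote at or before its timestamp (q's aj). A minimal Python sketch of that semantics, using hypothetical data and stdlib bisect rather than the actual q implementation:

```python
from bisect import bisect_right

# Hypothetical quote stream: sorted timestamps and corresponding mids.
# In q this would be roughly: aj[`time; trades; quotes]
quote_times = [100, 105, 110, 120]
quote_mids = [10.0, 10.1, 10.2, 10.3]

def asof(trade_time):
    """Return the prevailing quote mid as of trade_time, or None if no quote yet."""
    i = bisect_right(quote_times, trade_time) - 1  # last quote at or before trade_time
    return quote_mids[i] if i >= 0 else None

# A trade at 112 picks up the quote published at 110; a trade at 99 predates all quotes.
joined = [(t, asof(t)) for t in [99, 105, 112, 130]]
```

The binary search keeps the lookup O(log n) per trade; q's aj exploits sorted, attributed columns to do the equivalent over whole vectors at once.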
Requirements
Expertise in KDB+/q with production systems: GW/RDB/HDB design, sym/partition strategies, attributes, asof/aj/uj, IPC patterns
Strong skills in q/KDB+; working proficiency in Python/PyKX and ideally Java for integration services
Solid Linux/UNIX fundamentals (networking, OS tuning) and familiarity with TCP/IP, UDP, Multicast; knowledge of FIX/OUCH/ITCH preferred
Proven track record of profiling and optimizing for microsecond-level latency (e.g., vectorization, batching, zero-copy, mmap)
Strong debugging and production incident response; experience with agile delivery
Market knowledge is a plus: market microstructure, SORs, algo trading systems
Nice to have: containers/Kubernetes, CI/CD, cloud (AWS/Azure/GCP), secrets/entitlements, Terraform/Ansible
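The windowed analytics and intraday aggregations referenced in this role typically bucket ticks into fixed time bars. A rough stdlib Python sketch of one-minute OHLC bars over hypothetical tick data (epoch-second timestamps assumed), mirroring what q expresses with xbar:

```python
# Hypothetical ticks as (epoch_seconds, price). The q equivalent is roughly:
# select o:first p, h:max p, l:min p, c:last p by 0D00:01 xbar time from trade
ticks = [(60, 10.0), (75, 10.4), (119, 10.2), (130, 10.1), (170, 10.3)]

def one_minute_bars(ticks):
    """Bucket ticks into 60-second bars and track open/high/low/close per bar."""
    bars = {}
    for t, p in ticks:
        bucket = t - t % 60  # floor to the minute boundary, like 60 xbar t
        if bucket not in bars:
            bars[bucket] = [p, p, p, p]  # first tick seeds all four fields
        else:
            o, h, l, _ = bars[bucket]
            bars[bucket] = [o, max(h, p), min(l, p), p]  # close tracks the latest tick
    return bars

bars = one_minute_bars(ticks)
```

In production kdb+ the same bucketing runs as a vectorized xbar over sorted columns; the loop above only illustrates the per-bar semantics.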
Benefits
Benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
Online training and certifications suited to your career path
Access to e-learning platform Mindgram - a holistic mental health and wellbeing platform
Work From Anywhere (WFA) - the temporary option to work remotely outside of Poland for up to 140 days per year (including Italy, Spain, the UK, Germany, Portugal, and Bulgaria)