Senior KDB Developer at GFT designing and deploying time-series systems across markets. Collaborating with Trading, Quant, and Platform teams in a fast-paced environment.
Responsibilities
Drive the design and development of real-time market data and analytics services on KDB+/q (GW/RDB/HDB/tickerplant, pub/sub, CEP)
Own technical delivery: requirements, solution design, coding, testing, deployment, and production support in low-latency environments
Collaborate with Traders, Quants, and Technology to deliver high-performance analytics (asof joins, windowed analytics, intraday aggregations; see the q sketch after this list)
Optimize for latency and throughput (IPC, memory layout, partitioning, attributes, OS tuning, NUMA, hugepages)
Implement and harden ETL pipelines for tick capture and reference data enrichment; integrate with Kafka and streaming services where applicable
Contribute to coding standards, code reviews, and test automation for q/KDB+ (including PyKX integration points)
Ensure observability (metrics, tracing, logging), production readiness, and on-call excellence
Risk & Compliance: appropriately assess risk in decisions, protect client reputation and assets, comply with applicable laws, regulations, and policies, apply sound ethical judgment, and escalate issues transparently
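For a flavour of the analytics named above, here is a minimal q sketch of an asof join and an intraday aggregation. The trade/quote schemas and values are illustrative assumptions, not details of GFT's systems:

    / illustrative trade and quote tables; schemas are assumptions
    trade:([] time:09:30:00.000+1000*til 5; sym:5#`AAPL; price:100+5?1.0; size:5?100);
    quote:([] time:09:30:00.000+500*til 10; sym:10#`AAPL; bid:99.5+10?0.5; ask:100+10?0.5);

    / asof join: attach the prevailing quote to each trade
    aj[`sym`time; trade; quote]

    / intraday aggregation: 5-minute VWAP buckets per symbol
    select vwap:size wavg price by sym, 5 xbar time.minute from trade

In production the in-memory quote table would typically carry a `g# attribute on sym to keep the aj lookup fast.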
Requirements
Expertise in KDB+/q with production systems: GW/RDB/HDB design, sym/partition strategies, attributes, asof/aj/uj, IPC patterns (a minimal write-path sketch follows this list)
Strong skills in q/KDB+; working proficiency in Python/PyKX and ideally Java for integration services
Solid Linux/UNIX fundamentals (networking, OS tuning) and familiarity with TCP/IP, UDP, and multicast; knowledge of FIX/OUCH/ITCH preferred
Proven track record of profiling and optimizing for microsecond-level latency (e.g., vectorization, batching, zero-copy, mmap)
Strong debugging and production incident response; experience with agile delivery
Market knowledge is a plus: market microstructure, smart order routers (SORs), algo trading systems
Nice to have: containers/Kubernetes, CI/CD, cloud (AWS/Azure/GCP), secrets/entitlements, Terraform/Ansible
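For context on the sym/partition and attribute idioms called out above, a minimal q write-path sketch; the schema, date, and /data/hdb paths are illustrative assumptions:

    / one day of illustrative ticks
    t:([] sym:`AAPL`MSFT`AAPL`MSFT; time:.z.t+1000*til 4; price:4?100.0; size:4?1000);

    / sort by sym, enumerate against the HDB sym file, and write one date partition
    `:/data/hdb/2024.01.02/trade/ set .Q.en[`:/data/hdb] `sym xasc t;

    / apply the parted attribute to the on-disk sym column
    @[`:/data/hdb/2024.01.02/trade; `sym; `p#];

    / load the HDB and query across the partition
    system"l /data/hdb";
    select n:count i, last price by sym from trade where date=2024.01.02

The parted attribute is what lets where-clause filters on sym read contiguous blocks instead of scanning the whole column.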
Benefits
Benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
Online training and certifications fitted to your career path
Access to Mindgram, a holistic mental health and wellbeing e-learning platform
Work From Anywhere (WFA) - the temporary option to work remotely outside of Poland for up to 140 days per year (including Italy, Spain, the UK, Germany, Portugal, and Bulgaria)