Senior KDB Developer at GFT designing and deploying time series systems across markets. Collaborating with Trading, Quant, and Platform teams in a fast-paced environment.
Responsibilities
Drive the design and development of real-time market data and analytics services on KDB+/q (GW/RDB/HDB/tickerplant, pub/sub, CEP)
Own technical delivery: requirements, solution design, coding, testing, deployment, and production support in low-latency environments
Collaborate with Traders, Quants, and Technology to deliver high-performance analytics (asof joins, windowed analytics, intraday aggregations; a minimal aj sketch follows this list)
Optimize for latency and throughput (IPC, memory layout, partitioning, attributes, OS tuning, NUMA, hugepages; a partitioning/attribute sketch follows this list)
Implement and harden ETL pipelines for tick capture and reference data enrichment; integrate with Kafka and streaming services where applicable
Contribute to coding standards, code reviews, and test automation for q/KDB+ (including PyKX integration points; a toy test-helper sketch follows this list)
Ensure observability (metrics, tracing, logging), production readiness, and on-call excellence
Risk & Compliance: appropriately assess risk in decisions, protect client reputation and assets, comply with applicable laws, regulations, and policies, apply sound ethical judgment, and escalate issues transparently
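As referenced above, a minimal q sketch of an asof join over hypothetical trade and quote tables (all names and values are invented for illustration, not taken from any real system):

/ aj matches each trade with the most recent quote at or before its
/ time, exact-matching on sym first
trade:([] time:09:30:00.000 09:30:00.500 09:30:01.200; sym:`AAPL`AAPL`MSFT; price:101.2 101.3 55.1)
quote:([] time:09:29:59.900 09:30:00.400 09:30:01.000; sym:`AAPL`AAPL`MSFT; bid:101.1 101.25 55.0; ask:101.3 101.35 55.2)
aj[`sym`time; trade; quote]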
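And a sketch of the partitioning/attribute point: writing one day of ticks to a date-partitioned HDB with the parted attribute on sym, via the built-in .Q.dpft (the HDB path and table contents are hypothetical):

/ .Q.dpft[dir; partition; parted field; table name] enumerates syms,
/ sorts by the field, applies the `p# (parted) attribute, and splays
trade:([] sym:`MSFT`AAPL`AAPL; time:09:30:00.000 09:30:01.000 09:30:02.000; price:55.1 101.2 101.3)
.Q.dpft[`:/data/hdb; 2024.01.15; `sym; `trade]
/ in-memory RDB tables typically carry the grouped attribute instead
update `g#sym from `trade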
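Finally, a toy assertion helper of the kind used in q test automation (the .test namespace and function names are invented for illustration):

/ hypothetical micro-assert: print on pass, signal an error on failure
.test.assertEq:{[e;a;m] $[e~a; -1 "PASS ",m; '"FAIL ",m]}
.test.assertEq[6; 1+2+3; "sum"]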
Requirements
Expertise in KDB+/q with production systems: GW/RDB/HDB design, sym/partition strategies, attributes, asof/aj/uj joins, IPC patterns (an IPC sketch follows this list)
Strong skills in q/KDB+; working proficiency in Python/PyKX and ideally Java for integration services
Solid Linux/UNIX fundamentals (networking, OS tuning) and familiarity with TCP/IP, UDP, and multicast; knowledge of FIX/OUCH/ITCH preferred
Proven track record of profiling and optimizing for microsecond-level latency (e.g., vectorization, batching, zero-copy, mmap; a vectorization sketch follows this list)
Strong debugging and production incident response; experience with agile delivery
Market knowledge is a plus: market microstructure, SORs, algo trading systems
Nice to have: containers/Kubernetes, CI/CD, cloud (AWS/Azure/GCP), secrets/entitlements, Terraform/Ansible
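On the IPC-patterns point above, a minimal sketch of synchronous and asynchronous calls over a handle (the port number and the .gw.query API name are hypothetical):

/ open a handle to a hypothetical gateway on localhost:5010
h:hopen `::5010
r:h(".gw.query"; `trade; .z.d)      / synchronous: blocks for the result
neg[h](".gw.query"; `trade; .z.d)   / asynchronous: fire-and-forget via neg[h]
hclose h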
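And a toy illustration of the vectorization point: replacing an interpreted per-element loop with a single vector primitive (the numbers are arbitrary; \t reports elapsed milliseconds):

n:1000000
px:100f+sums n?0.01   / hypothetical price series (random walk)
qty:1+n?100           / hypothetical trade sizes
/ scalar loop: interpreted once per element
notionalLoop:{[p;q] s:0f; i:0; do[count p; s+:p[i]*q[i]; i+:1]; s}
/ vector primitive: one pass over whole columns, typically far faster
notionalVec:{[p;q] p wsum q}
\t notionalLoop[px;qty]
\t notionalVec[px;qty]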
Benefits
Benefit package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
Online training and certifications fit for your career path
Access to e-learning platform Mindgram - a holistic mental health and wellbeing platform
Work From Anywhere (WFA) - the temporary option to work remotely outside of Poland for up to 140 days per year (including Italy, Spain, the UK, Germany, Portugal, and Bulgaria)