Drive the design and development of real-time market data and analytics services on KDB+/q (GW/RDB/HDB/tickerplant, pub/sub, CEP)
Own technical delivery: requirements, solution design, coding, testing, deployment, and production support in low-latency environments
Collaborate with Traders, Quants, and Technology to deliver high-performance analytics (asof joins, windowed analytics, intraday aggregations); see the first q sketch after this list
Optimize for latency and throughput (IPC, memory layout, partitioning, attributes, OS tuning, NUMA, hugepages); an async IPC sketch follows this list
Implement and harden ETL pipelines for tick capture and reference data enrichment; integrate with Kafka and streaming services where applicable
Contribute to coding standards, code reviews, and test automation for q/KDB+ (including PyKX integration points)
Ensure observability (metrics, tracing, logging), production readiness, and on-call excellence
Risk & Compliance: appropriately assess risk in decisions, protect client reputation and assets, comply with applicable laws, regulations, and policies, apply sound ethical judgment, and escalate issues transparently
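For illustration, a minimal q sketch of the analytics named above; the table names, schemas, and values are hypothetical:

/ hypothetical in-memory tables for illustration
trades:([] time:09:30:00.000 09:30:00.500 09:31:01.200; sym:`AAPL`AAPL`MSFT; price:189.1 189.2 402.5; size:100 200 50)
quotes:([] time:09:29:59.900 09:30:00.400 09:31:01.000; sym:`AAPL`AAPL`MSFT; bid:189.0 189.1 402.4; ask:189.2 189.3 402.6)

/ asof join: attach the prevailing quote to each trade
aj[`sym`time; trades; quotes]

/ windowed intraday aggregation: 5-minute OHLC and volume buckets via xbar
select open:first price, high:max price, low:min price, close:last price, vol:sum size
    by sym, bar:5 xbar time.minute from trades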
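And a sketch of one latency technique the role touches on: asynchronous IPC over a handle avoids blocking the hot path on a reply. The port and message shape here are assumptions, not a prescribed protocol:

h:hopen `::5010                  / open an IPC handle to a downstream process (port is illustrative)
(neg h)(`upd; `trade; trades)    / negative handle = async send: no round-trip wait on the hot path
hclose h                         / release the handle when finished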
Requirements
Expertise in KDB+/q on production systems: GW/RDB/HDB design, sym/partition strategies, attributes, asof/aj/uj, IPC patterns; a small attribute/partition sketch follows this list
Strong skills in q/KDB+; working proficiency in Python/PyKX and ideally Java for integration services
Solid Linux/UNIX fundamentals (networking, OS tuning) and familiarity with TCP/IP, UDP, and multicast; knowledge of FIX/OUCH/ITCH preferred
Proven track record of profiling and optimizing for microsecond-level latency (e.g., vectorization, batching, zero-copy, mmap)
Strong debugging and production incident response; experience with agile delivery
Market knowledge is a plus: market microstructure, smart order routers (SORs), algorithmic trading systems
Nice to have: containers/Kubernetes, CI/CD, cloud (AWS/Azure/GCP), secrets/entitlements, Terraform/Ansible
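A minimal sketch of the attribute and partition work described above, reusing the hypothetical trades table from the earlier sketch; the HDB root /data/hdb is likewise an assumption:

/ grouped attribute on sym speeds symbol lookups in the in-memory RDB
trades:update `g#sym from trades

/ end-of-day write-down: .Q.dpft enumerates symbols, sorts by the sym column,
/ applies the parted (`p#) attribute, and saves one date partition to the HDB
.Q.dpft[`:/data/hdb; .z.d; `sym; `trades]

On disk, the parted attribute plus date partitioning keeps per-symbol queries to a small number of sequential reads.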
Benefits
Benefits package that can be tailored to your personal needs (private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.)
Online training and certifications fitting your career path
Access to Mindgram, a holistic mental health and wellbeing e-learning platform
Work From Anywhere (WFA) - the temporary option to work remotely outside Poland for up to 140 days per year (including from Italy, Spain, the UK, Germany, Portugal, and Bulgaria)