KDB Developer

GFT Group

Toronto

Hybrid

CAD 90,000 - 120,000

Full time

30+ days ago

Job summary

A leading technology firm in Toronto is searching for a skilled KDB+/q Developer. This hybrid role involves designing and maintaining high-performance analytics platforms for trading and market data processing. Candidates should have strong experience in KDB+/q development and a solid understanding of real-time data systems and trading workflows. Excellent collaboration skills are important as you will work closely with teams across the organization.

Qualifications

  • 3+ years of experience in KDB+/q development for real-time and historical analytics.
  • Experience with memory and CPU tuning.
  • Solid understanding of Agile methodologies and CI/CD processes.

Responsibilities

  • Design, implement, and maintain streaming data pipelines using Apache Flink.
  • Collaborate with traders and risk teams for real-time support.
  • Support infrastructure teams in deployment and monitoring.

Skills

KDB+/q development
Query optimization
Real-time data processing
Kafka for stream ingestion
Low-latency system design
Agile methodologies
Problem-solving

Education

Bachelor's degree in Computer Science, Engineering, or a related field

Tools

Apache Flink
Ansible
Terraform

Job description

Contract and full-time opportunities available; no visa sponsorship is offered for this role. Hybrid work: 2-3 days per week from the client office in downtown Toronto or Mississauga.
KDB+/q Developer

Our client is seeking a hands‑on developer with experience in KDB+/q to support real‑time and historical data processing needs within its Markets Technology group. This role focuses on building high‑performance analytics platforms to support trading, surveillance, market data, and regulatory reporting, using in‑house and open‑source Flink deployments (not Confluent).

Key Responsibilities
  • Design, implement, and maintain streaming data pipelines using Apache Flink (open‑source, non‑Confluent) to process high‑volume trade and market data.
  • Develop and optimize KDB+/q applications for storing and analysing tick data, order books, and market microstructure analytics (a brief q sketch follows this list).
  • Collaborate with traders, quants, and risk teams to deliver solutions for real‑time decision support and post‑trade analysis.
  • Integrate Flink‑based services with Kafka, internal APIs, and downstream data consumers.
  • Support internal infrastructure teams on deployment, monitoring, and tuning of clusters.
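For illustration only (our sketch, not part of the client's posting): the tick-analytics work above, shown as a minimal q query over a small hypothetical in-memory trade table.

    / hypothetical trade table with a few ticks
    trade:([] time:09:30:00.100 09:30:00.450 09:31:01.200; sym:`AAPL`MSFT`AAPL; price:189.20 411.50 189.35; size:100 200 50)

    / per-symbol, per-minute volume-weighted average price and traded volume
    select vwap:size wavg price, vol:sum size by sym, bar:1 xbar time.minute from trade

In production the same query shape would typically run against date-partitioned historical databases and real-time subscriber tables rather than a hand-built table.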
Required Qualifications & Experience
  • 3+ years of experience in KDB+/q development for real‑time and historical analytics platforms.
  • 2+ years of experience in query optimization, schema design for large time-series datasets, real-time data processing, and memory and CPU tuning (a brief optimization sketch follows this list).
  • Strong understanding of trading workflows, market data feeds, and risk systems in investment banking.
  • Experience with Kafka for stream ingestion and distribution.
  • Familiarity with low‑latency system design, performance tuning, and distributed computing.
  • Strong communication skills, with the ability to articulate complex technical concepts and to work with global, multicultural teams across Markets and Risk.
  • Solid understanding of Agile methodologies and CI/CD processes.
  • Strong problem-solving skills with the ability to prioritize multiple tasks, set goals, and meet deadlines.
  • Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • Prior experience in Equities, FX, or Fixed Income trading technology.
  • Familiarity with CI/CD tools and infrastructure‑as‑code frameworks (e.g., Ansible, Terraform).
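
As a rough sketch of the query-optimization work listed above (our illustration, using a hypothetical in-memory table t): sorting on the symbol column and applying the parted attribute lets kdb+ jump straight to each symbol's block instead of scanning every row.

    / sort by sym, then apply the parted attribute to the sym column
    t:@[`sym xasc t;`sym;`p#]

    / symbol filters now use the attribute's index rather than a linear scan
    select from t where sym=`AAPL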