
KDB Developer

GFT

Toronto

Hybrid

CAD 80,000 - 120,000

Full time

Yesterday

Job summary

A leading technology firm is seeking a KDB / q Developer in Toronto. You will design and maintain streaming data pipelines, optimize KDB / q applications, and collaborate with traders for real-time decision support. The ideal candidate has 3+ years of experience in KDB / q development, strong problem-solving abilities, and a bachelor’s degree in a relevant field. This role offers a hybrid work model.

Qualifications

  • 3 years of experience in KDB / q development for real-time and historical analytics platforms.
  • Strong understanding of trading workflows, market data feeds and risk systems.
  • Excellent communication skills capable of articulating complex technical concepts.

Responsibilities

  • Design, implement and maintain streaming data pipelines using Apache Flink.
  • Develop and optimize KDB / q applications for storing and analysing trade data.
  • Collaborate with traders and risk teams to deliver solutions for decision support.

Skills

KDB / q development
Real-time analytics
Query optimization
Kafka
Agile methodologies
CI / CD processes
Problem-solving
Communication skills

Education

Bachelor's degree in Computer Science, Engineering or related field

Tools

Apache Flink
Ansible
Terraform

Job description

Contract and full-time opportunities available. No visa sponsorship is offered for this role.

Hybrid work: 2-3 days per week from the client office in downtown Toronto or Mississauga.

KDB / q Developer

Our client is seeking a hands-on developer with KDB / q experience to support real-time and historical data processing needs within its Markets Technology group. This role focuses on building high-performance analytics platforms for trading surveillance, market data and regulatory reporting, using in-house and open-source Flink deployments (not Confluent).

Responsibilities
  • Design, implement and maintain streaming data pipelines using Apache Flink (open-source, non-Confluent) to process high-volume trade and market data.
  • Develop and optimize KDB / q applications for storing and analysing tick data, order books and market microstructure analytics.
  • Collaborate with traders, quants and risk teams to deliver solutions for real-time decision support and post-trade analysis.
  • Integrate Flink-based services with Kafka, internal APIs and downstream data consumers.
  • Support internal infrastructure teams on deployment, monitoring and tuning of clusters.
Required Qualifications & Experience
  • 3 years of experience in KDB / q development for real-time and historical analytics platforms.
  • 2 years of experience in query optimization, schema design for large time-series data sets, real-time data processing, and memory and CPU tuning.
  • Strong understanding of trading workflows, market data feeds and risk systems in investment banking.
  • Experience with Kafka for stream ingestion and distribution.
  • Familiarity with low-latency system design, performance tuning and distributed computing.
  • Strong communication skills and the ability to work with global teams across Markets and Risk.
  • Solid understanding of Agile methodologies and CI / CD processes.
  • Strong problem-solving skills with the ability to prioritize multiple tasks, set goals and meet deadlines.
  • Excellent communication skills capable of articulating complex technical concepts in a multicultural team environment.
  • Bachelor's degree in Computer Science, Engineering or a related field (or equivalent experience).
  • Prior experience in Equities, FX or Fixed Income trading technology.
  • Familiarity with CI / CD tools and infrastructure-as-code frameworks (e.g. Ansible, Terraform).

Employment Details
  • Employment Type: Full Time
  • Vacancy: 1