KDB+ & Apache Flink Developer

GFT Group

Toronto

Hybrid

CAD 90,000 - 120,000

Full time

5 days ago

Job summary

A leading company in the financial technology sector is seeking a KDB+/Apache Flink Developer to build high-performance analytics platforms. This hybrid role involves designing and maintaining data pipelines, optimizing applications, and collaborating with global teams to support trading and regulatory reporting.

Qualifications

  • 4+ years of experience in KDB+/q development.
  • 2+ years of experience in Apache Flink.

Responsibilities

  • Design and maintain streaming data pipelines using Apache Flink.
  • Develop and optimize KDB+/q applications for analytics.
  • Collaborate with teams for real-time decision support.

Skills

KDB+/q
Apache Flink
Kafka
Agile methodologies
Problem-solving
Communication

Education

Bachelor's degree in Computer Science

Tools

Ansible
Terraform

Job description

**Contract opportunity available - no visa sponsorship options for this role. Hybrid work: 2-3 days per week from the client office in Downtown Toronto or Mississauga.**

KDB+ / Apache Flink Developer

Our client is seeking a hands-on developer with experience in KDB+/q and Apache Flink to support real-time and historical data processing needs within its Markets Technology group. This role focuses on building high-performance analytics platforms to support trading, surveillance, market data, and regulatory reporting, using in-house and open-source Flink deployments (not Confluent).

Key Responsibilities

  • Design, implement, and maintain streaming data pipelines using Apache Flink (open-source, non-Confluent) to process high-volume trade and market data.
  • Develop and optimize KDB+/q applications for storing and analysing tick data, order books, and market microstructure analytics.
  • Collaborate with traders, quants, and risk teams to deliver solutions for real-time decision support and post-trade analysis.
  • Integrate Flink-based services with Kafka, internal APIs, and downstream data consumers.
  • Support internal infrastructure teams on deployment, monitoring, and tuning of Apache Flink clusters.

Required Qualifications & Experience

  • 4+ years of experience in KDB+/q development for real-time and historical analytics platforms.
  • 2+ years of experience in Apache Flink (open-source, self-managed or Citi-hosted environments).
  • Strong understanding of event time processing, stateful stream processing, and checkpointing in Flink.
  • Experience with Kafka for stream ingestion and distribution.
  • Familiarity with low-latency system design, performance tuning, and distributed computing.
  • Strong communication skills and ability to work with global teams across Markets and Risk.
  • Solid understanding of Agile methodologies and CI/CD processes.
  • Strong problem-solving skills with the ability to prioritize multiple tasks, set goals, and meet deadlines.
  • Excellent communication skills, capable of articulating complex technical concepts in a multicultural team environment.
  • Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
  • Prior experience in Equities, FX, or Fixed Income trading technology.
  • Exposure to on-prem or cloud-based deployments of Apache Flink (non-Confluent).
  • Familiarity with CI/CD tools and infrastructure-as-code frameworks (e.g., Ansible, Terraform).