Senior Data Engineer | Systematic Trading

JR United Kingdom

London

On-site

GBP 60,000 - 100,000

Full time

2 days ago

Job summary

An established industry player is seeking a Senior Data Engineer to architect and scale a global Data Lakehouse, crucial for live trading strategies. This role offers the opportunity to transition from maintaining legacy systems to building a cutting-edge platform that supports quant researchers and traders. You'll be responsible for developing low-latency, high-availability data pipelines in a cloud-native environment, primarily using Python and SQL. If you have a strong background in systematic trading or fintech and are eager to make a significant impact, this position is for you.

Qualifications

  • 5+ years of experience in systematic trading or fintech environments.
  • Proficient in Python and SQL, with a strong grasp of data modeling.

Responsibilities

  • Architect and build a global Lakehouse platform in AWS.
  • Develop scalable data pipelines and optimize performance for high-concurrency access.

Skills

Python
SQL
C++
Data Modeling
Cloud Infrastructure
Data Pipelines

Tools

AWS

Job description

Client: NJF Global Holdings Ltd
Location: London, United Kingdom
Job Category: Other
EU work permit required: Yes

Posted: 28.04.2025
Expiry Date: 12.06.2025

A top-tier quant fund is hiring a Senior Data Engineer to help architect and scale a global Data Lakehouse that feeds directly into live trading strategies. If you've already built data infrastructure inside a hedge fund or HFT shop, you'll know what this role really means: low-latency, high-availability, globally distributed data pipelines that quant teams trust every day.

This is your chance to level up from patching legacy pipelines to building the core platform used by quant researchers and traders across desks and regions.

What You’ll Be Driving

  • Architect and build a global Lakehouse platform (Data Lake + OLAP) in AWS
  • Develop scalable data pipelines in Python, working with SDKs and data libraries
  • Own end-to-end data modeling, ingestion, validation, and optimization for high-concurrency access
  • Tune performance of cross-region, multi-format data stores (columnar, real-time, etc.)
  • Deliver tailored solutions directly to quants and traders – real impact, real visibility
  • Take the lead on cloud infra design, deployment, and automation

What You Bring

  • 5+ years building data platforms in systematic trading, fintech, or high-throughput environments
  • Mastery of Python and SQL – C++ is a plus
  • Experience with cloud-native stacks (ideally AWS) and interest in infrastructure as code
  • A structured, pragmatic mindset with strong ownership and autonomy
  • Ability to work closely with front-office teams and iterate fast
  • Finance background is a strong plus — you get why PnL depends on good data