Software Developer - Data Pipelines (Python)

Squarepoint Capital

London

On-site

GBP 60,000 - 80,000

Full time

2 days ago

Job summary

A leading data services firm in London is looking for an experienced Software Developer specializing in data pipelines. This role involves designing and optimizing data pipelines, collaborating with quant researchers, and enhancing overall system efficiency. Ideal candidates should have over 4 years of Python experience and a STEM degree. Join a diverse team committed to learning and innovation.

Qualifications

  • 4+ years of experience coding to a high standard in Python.
  • Experience with SQL and one or more common RDBMS platforms.
  • Practical knowledge of data transfer protocols and tools.

Responsibilities

  • Take shared ownership of the ever-growing estate of data pipelines.
  • Design, implement, test, optimize and troubleshoot data pipelines.
  • Collaborate with researchers to onboard new datasets.

Skills

Python
SQL
Data transformation
Collaboration

Education

Bachelor's degree in a STEM subject

Tools

Postgres
AWS S3

Job description

Software Developer - Data Pipelines (Python), London

Client:

Squarepoint Capital

Location:

London, United Kingdom

Job Category:

Other

EU work permit required:

Yes

Job Reference:

188749070971

Job Views:

68

Posted:

12.08.2025

Expiry Date:

26.09.2025

Job Description:

Position Overview:

We are seeking an experienced Python developer to join our Alpha Data team, which is responsible for delivering vast quantities of data to users worldwide. You will be a cornerstone of a growing Data team, becoming a technical subject-matter expert and building strong working relationships with quant researchers, traders, and colleagues across our Technology organisation.

Alpha Data teams are able to deploy valuable data to the rest of the Squarepoint business at speed. Ingestion pipelines and data transformation jobs are resilient and highly maintainable, while the data models are carefully designed in close collaboration with our researchers for efficient query construction and alpha generation.

We achieve economies of scale by building new frameworks, libraries, and services that improve the team's quality of life, throughput, and code quality. Teamwork and collaboration are encouraged, excellence is rewarded, and diversity of thought and creative solutions are valued. Our emphasis is on a culture of learning, development, and growth.

Responsibilities:

  • Take shared ownership of our ever-growing estate of data pipelines.
  • Propose and contribute to new abstractions and improvements, making a real, positive impact across our team globally.
  • Design, implement, test, optimize, and troubleshoot our data pipelines, frameworks, and services.
  • Collaborate with researchers to onboard new datasets.
  • Regularly take the lead on production support operations (during normal working hours only).

Required Qualifications:

  • 4+ years of experience coding to a high standard in Python.
  • Bachelor's degree in a STEM subject.
  • Experience with and knowledge of SQL, and one or more common RDBMS platforms (we mostly use Postgres).
  • Practical knowledge of common data-transfer protocols and tools (e.g. FTP, SFTP, HTTP APIs, AWS S3).

Nice to Have:

  • Experience with big data frameworks, databases, distributed systems, or Cloud development.
  • Experience with any of these: C++, kdb+/q, Rust.