Data Engineer

Aspect Capital

London

On-site

GBP 40,000 - 65,000

Full time

30+ days ago

Job summary

Aspect Capital, a leading systematic hedge fund in London, is looking for a Data Engineer to join its team. The role involves building streaming platforms, developing ELT pipelines, and enhancing client libraries while working closely with quantitative professionals. It requires expertise in Python and SQL, as well as familiarity with cloud platforms and modern data engineering tools, and is ideal for candidates passionate about technology and continuous improvement.

Skills

Python
SQL
SDLC
DevOps
Data Engineering
Communication

Tools

Git
Docker
Jenkins
Snowflake
dbt
Kafka
Airflow

Job description

Aspect Capital is an award-winning systematic hedge fund based in London that manages over $8 billion of client assets, and technology is an integral part of the business. We are looking for an engineer to join our Data Engineering team. The team's role is broad, covering the ingestion, storage, transformation and distribution of tick, timeseries, reference and alternative datasets. The technology stack is similarly varied, including a range of legacy and modern systems across on-premises and cloud infrastructure.

This is an exciting time to join the team as we consolidate our technology estate, revamp how we process and filter data, and overhaul the way data is accessed by our consumers, while continuing to onboard new datasets that enhance our strategies. We are a lean team owning end-to-end delivery from initial design through to operational support in production.

Job requirements

  • 1-3 years working as a Data Engineer
  • Expertise with Python and SQL
  • Understanding of core database concepts
  • Familiarity with a cloud platform or data warehouse
  • SDLC and DevOps: Git, Docker, Jenkins/TeamCity, monitoring, automated testing
  • Ability to communicate clearly with technical and non-technical colleagues

Experience in the following areas would be ideal:

  • dbt and Snowflake
  • Kafka
  • Airflow

Job responsibilities

  • Building a streaming platform to capture and aggregate large volumes of tick data
  • Developing ELT pipelines to ingest and transform datasets with Python, Snowflake and dbt
  • Enhancing client libraries to provide unified access to our entire data catalogue
  • Supporting our Java live data feedhandlers
  • Consolidating legacy MATLAB systems onto our strategic technology stack
  • Working closely with quantitative developers, researchers and portfolio managers

If you are passionate about technology, stay current with industry trends, follow engineering best practices, and are always looking for opportunities to improve systems, processes, and performance, then we would love to hear from you.