Data Engineer

Balyasny Asset Management LP

City of Westminster

On-site

GBP 50,000 - 80,000

Full time

Job summary

A leading hedge fund in London is seeking a Data Engineer to work on infrastructure management and data analytics. This opportunity involves ownership of the investment team's data pipeline, collaborating with Analysts to optimize data processes and support investment decisions. Ideal candidates will have experience in building ETL/ELT pipelines, proficiency in Python and SQL, and excellent communication skills. Join a dynamic team focused on innovative data-driven insights.

Qualifications

  • 1 to 5 years of experience in building and managing ETL/ELT data pipelines.
  • Proficient in Python3 with a strong focus on common data libraries.
  • Experience with Apache Airflow for workflow management.

Responsibilities

  • Collaborate with Analysts and the Portfolio Manager to develop creative uses for data.
  • Collect and clean structured and unstructured data from various sources.
  • Identify and improve existing infrastructure solutions for data management.

Skills

Data Engineering
Data Analysis
Python
SQL
Communication
Attention to detail

Education

Bachelor's or master's degree in computer science or a related field

Tools

Apache Airflow
Microsoft Excel
AWS
PostgreSQL
Snowflake

Job description

Balyasny Asset Management is looking for an exceptional data engineer to work with an Industrials portfolio team in London on projects related to infrastructure management, data analysis, and data-driven idea generation. We are looking for someone with expertise in data engineering and data analytics who is interested in applying their skillset to a markets-facing role. This is an excellent opportunity to take full ownership of a fundamental investment team's data pipeline at a leading hedge fund, offering hands‑on experience at the intersection of data analysis and investing.

Responsibilities

  • Collaborate with Analysts and the Portfolio Manager to develop creative uses for data in the investment process
  • Collect structured and unstructured data from various sources (e.g., websites, PDF documents, e‑mails), then clean, transform, and store it in a format and location that make it easy to consume for analysis (e.g., Excel)
  • Identify opportunities to improve existing infrastructure, such as optimizing data storage solutions or streamlining the data ingestion process to increase the volume, velocity, and variety of the ingested data
  • Develop and expand team data infrastructure to capture new data streams and automate the end‑to‑end ETL/ELT process (a minimal sketch follows this list)
  • Support investment decisions through independent research on new datasets, pinpointing trends, correlations, and patterns in complex data
  • Effectively communicate technical details and insights to non‑technical team members
  • Take complete ownership of the data pipeline as a fully integrated member of the team
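
To make the ETL/ELT and Airflow expectations concrete, here is a minimal sketch of the kind of daily pipeline described above, written with Airflow's TaskFlow API. The source URL, file paths, table name, and connection string are hypothetical placeholders for illustration, not details from this posting.

from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def industrials_etl():
    @task
    def extract() -> str:
        # Pull a raw CSV from a hypothetical vendor endpoint.
        df = pd.read_csv("https://example.com/industrials/daily.csv")
        raw_path = "/tmp/industrials_raw.csv"
        df.to_csv(raw_path, index=False)
        return raw_path

    @task
    def transform(raw_path: str) -> str:
        # Basic cleaning: drop incomplete rows and normalize types.
        df = pd.read_csv(raw_path)
        df = df.dropna(subset=["ticker"])
        df["date"] = pd.to_datetime(df["date"])
        clean_path = "/tmp/industrials_clean.parquet"
        df.to_parquet(clean_path, index=False)
        return clean_path

    @task
    def load(clean_path: str) -> None:
        # Store in a table analysts can query directly or export to Excel.
        from sqlalchemy import create_engine
        engine = create_engine("postgresql://user:pass@host/research")  # placeholder DSN
        pd.read_parquet(clean_path).to_sql(
            "industrials_daily", engine, if_exists="append", index=False
        )

    load(transform(extract()))


industrials_etl()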
Qualifications

  • Bachelor's or master's degree in computer science, mathematics, physics, or another quantitative field from a top school
  • Prior training in a quantitative scientific field that uses computational data analysis (e.g., computer science, statistics, applied mathematics, physics, engineering, economics/econometrics, chemistry/biology)
  • 1 to 5 years of experience in building and managing ETL/ELT data pipelines
  • Proficient in Python3, with a strong focus on the most common data libraries (e.g., pandas, NumPy), and in SQL
  • Experience with Apache Airflow for workflow management
  • Proficient with Microsoft Excel
  • Knowledge of the Amazon AWS data ecosystem
  • Expertise in setting up, maintaining, and fine‑tuning SQL databases (e.g., PostgreSQL and Snowflake); a short tuning sketch follows this list
  • Excellent communication skills, with the ability to explain technical concepts to non‑technical users
  • Exceptional attention to detail; highly motivated, hard‑working, and a self‑starter, with the highest integrity and character
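
As a rough illustration of the database fine-tuning point above, a short Python sketch that adds an index for a common analyst query and inspects the resulting query plan. The connection string, table, and column names are assumptions made for this example.

from sqlalchemy import create_engine, text

engine = create_engine("postgresql://user:pass@host/research")  # placeholder DSN

with engine.begin() as conn:
    # Index the columns analysts filter on most often (hypothetical table).
    conn.execute(text(
        "CREATE INDEX IF NOT EXISTS ix_industrials_ticker_date "
        "ON industrials_daily (ticker, date)"
    ))
    # Confirm the planner now uses an index scan rather than a sequential scan.
    plan = conn.execute(text(
        "EXPLAIN ANALYZE SELECT * FROM industrials_daily "
        "WHERE ticker = 'ABC' AND date >= '2024-01-01'"
    ))
    for row in plan:
        print(row[0])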
Nice to Have

  • Experience with data visualization tools such as Tableau or Streamlit
  • Experience with Docker and containerized architectures (e.g., Kubernetes, AWS ECS)
  • Experience with real‑time data streaming (e.g., Kafka)
  • Experience with GitHub and Jenkins
  • Basic understanding of markets and financial statements

Please apply only if your profile fits the listed requirements. We receive a large volume of applications and cannot reply to each one individually. Thank you for your interest in Balyasny; if your profile is suitable, we will reach out.
