Senior Data Engineer - McGregor Boyall

ZipRecruiter

London

Hybrid

GBP 80,000 - 95,000

Full time

9 days ago

Job summary

A global financial services firm in London is seeking a Senior Data Engineer to develop and maintain essential data pipelines for their brokerage business. You'll work in a hybrid model, utilizing strong Python and SQL skills to ensure high-quality data solutions. This hands-on role requires experience with Airflow, cloud environments, and managing complex datasets. Join the team to foster data quality and improve analytical capabilities while collaborating with various stakeholders.

Qualifications

  • Proven experience designing data pipelines in cloud environments.
  • Hands-on experience with Airflow (ideally MWAA).
  • Background working with large, complex datasets.

Responsibilities

  • Design and build robust ETL pipelines using Python and AWS services.
  • Own and maintain Airflow workflows.
  • Ensure high data quality through rigorous testing and validation.

Skills

Strong Python skills
SQL skills
Experience with Airflow
Designing data pipelines
Attention to detail

Tools

Python
SQL/PLSQL
Apache Airflow
AWS Glue
Git

Job description

Senior Data Engineer
Up to £95,000 + benefits | Hybrid (3 days a week in City of London)

A global financial services firm is hiring a Senior Data Engineer to join their Brokerage technology team.

You'll be building and maintaining the data pipelines that underpin their £1bn+ broking business - with a strong focus on improving brokerage data quality, optimising commercial analysis, and supporting complex client agreements.

This is a hands-on engineering role working closely with stakeholders and system owners. You'll be expected to code daily (Python), manage Airflow pipelines (MWAA), build ETL processes from scratch, and improve existing workflows for better performance and scalability.

Key responsibilities

  • Design and build robust ETL pipelines using Python and AWS services

  • Own and maintain Airflow workflows

  • Ensure high data quality through rigorous testing and validation

  • Analyse and understand complex data sets before pipeline design

  • Collaborate with stakeholders to translate business requirements into data solutions

  • Monitor and improve pipeline performance and reliability

  • Maintain documentation of systems, workflows, and configs
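To illustrate the kind of work described above, here is a minimal sketch of an extract-validate-transform step with a data-quality gate, written in plain Python. All names (the trade feed, the commission field) are hypothetical examples, not details of the employer's actual systems, and a real pipeline would run inside an Airflow task against AWS sources rather than an inline string.

```python
"""Illustrative ETL sketch: parse a raw feed, reject rows that fail a
data-quality rule, and cast the survivors for loading. Hypothetical data."""
import csv
import io

# Stand-in for a raw brokerage feed (in practice, an S3 object or Glue table).
RAW = """trade_id,notional,commission
T001,1000000,1250.00
T002,250000,
T003,500000,625.00
"""

def extract(raw: str) -> list[dict]:
    # Parse the raw CSV feed into a list of row dicts.
    return list(csv.DictReader(io.StringIO(raw)))

def validate(rows: list[dict]) -> tuple[list[dict], list[dict]]:
    # Data-quality gate: every trade must carry a commission value.
    good, bad = [], []
    for row in rows:
        (good if row["commission"] else bad).append(row)
    return good, bad

def transform(rows: list[dict]) -> list[dict]:
    # Cast numeric fields; a load step would then write to the warehouse.
    return [
        {"trade_id": r["trade_id"],
         "notional": float(r["notional"]),
         "commission": float(r["commission"])}
        for r in rows
    ]

good, rejects = validate(extract(RAW))
loaded = transform(good)
print(f"loaded {len(loaded)} rows, rejected {len(rejects)}")
```

Separating validation from transformation, as above, lets rejected rows be routed to a quarantine table for investigation instead of silently failing the whole pipeline run.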

Tech environment

  • Python, SQL/PLSQL (MS SQL + Oracle), PySpark

  • Apache Airflow (MWAA), AWS Glue, Athena

  • AWS services (CDK, S3, data lake architectures)

  • Git, JIRA

You should apply if you have:

  • Strong Python and SQL skills

  • Proven experience designing data pipelines in cloud environments

  • Hands-on experience with Airflow (ideally MWAA)

  • Background working with large, complex datasets

  • Experience in finance or similar high-volume, regulated industries (but not essential)

  • High attention to detail and a clear commitment to data quality

McGregor Boyall is an equal opportunity employer and does not discriminate on any grounds.
