Senior Data Engineer

DataArt

City Of London

On-site

GBP 60,000 - 80,000

Full time

Today

Job summary

A leading technology company in London is seeking an experienced Senior Data Engineer to oversee data onboarding and transformation for financial services. The role requires strong expertise in Snowflake, DBT, and Azure Data Factory, along with solid knowledge of SQL and Python. As part of a dynamic team, the ideal candidate will collaborate on data orchestration and pipeline development, ensuring high-quality data management practices.

Qualifications

  • Production experience as a Data Engineer in asset management or financial services.
  • Strong expertise in Snowflake, DBT, and data pipeline orchestration tools (Azure Data Factory).
  • Strong knowledge of SQL, Python, data modeling, and warehousing principles.

Responsibilities

  • Onboard new datasets and develop data models using Snowflake and DBT.
  • Build and maintain data transformation pipelines.
  • Design and manage data orchestration and ETL workflows with Azure Data Factory.

Skills

Snowflake
DBT
Data pipeline orchestration
SQL
Python
Data modeling
Azure cloud services
DevOps

Tools

Azure Data Factory

Job description

Overview

Client

Our client is a hedge fund sponsor that mainly manages pooled investment vehicles and typically invests in fixed income, private equity, rates, credit, and foreign exchange. The company operates offices in London, New York, and Hong Kong.

Position overview

We are seeking an experienced Senior Data Engineer with a background in asset management or financial services to join our team. The ideal candidate will have expertise in handling diverse datasets via batch files, APIs, and streaming from both internal and external sources. This position is based in London, as occasional presence at the client's office is required.

Responsibilities
  • Onboard new datasets and develop data models using Snowflake and DBT
  • Build and maintain data transformation pipelines
  • Design and manage data orchestration and ETL workflows with Azure Data Factory
  • Optimize queries and apply data warehousing best practices for large and complex datasets
  • Collaborate with development teams using agile methodologies, DevOps, Git, and CI/CD pipelines
  • Support cloud-based services, especially Azure Functions, KeyVault, and LogicApps
  • Optionally develop APIs to serve data to internal or external stakeholders
Requirements
  • Production experience as a Data Engineer in asset management or financial services
  • Strong expertise in Snowflake, DBT, and data pipeline orchestration tools (Azure Data Factory)
  • Strong knowledge of SQL, Python, data modeling, and warehousing principles
  • Familiarity with DevOps practices including CI/CD and version control (Git)
  • Experience with Azure cloud services
Nice to have
  • Industry knowledge of Security Master, IBOR, and Portfolio Management