Senior Data Engineer

SF Recruitment

Birmingham

Hybrid

GBP 50,000 - 70,000

Full time


Job summary

A leading UK company is seeking a Senior Data Engineer to lead a major data transformation initiative. You will design and implement an Azure-based data platform, focusing on building data pipelines and enabling analytics through Power BI. Ideal candidates will have strong SQL skills and experience with Azure tools. This is a hybrid position based in Birmingham, offering significant architectural influence.

Responsibilities

  • Design, build, and maintain Azure data pipelines using Azure Data Factory, Synapse, or Fabric.
  • Implement a data lakehouse architecture and establish best-practice ETL/ELT frameworks.
  • Integrate data from multiple core systems, including ERP, finance, and supply chain.
  • Develop and optimise SQL data models and support Power BI datasets.
  • Collaborate with Finance and IT to translate reporting needs into technical solutions.
  • Monitor and optimise data pipelines for performance.
  • Define standards and documentation for scalability.

Skills

Azure data platform development
SQL development
ERP system integration
Python or PySpark
Data governance and security
Preparing data for Power BI
Excellent communication skills

Tools

Azure Data Factory
Azure Synapse
Databricks
Power BI

Job description

Position: Senior Data Engineer

Hybrid - Birmingham

6 months - Outside IR35

Overview

Join a leading UK company as a Senior Data Engineer and play a key role in a major data transformation project. You will have the opportunity to design and deliver a new Azure-based data platform, modernising the organisation's data management and reporting processes. This hands‑on role offers architectural influence and is ideal for an experienced engineer with a strong background in setting up new environments, creating data pipelines, and enabling self‑service analytics through Power BI.

Key Responsibilities
  • Design, build, and maintain Azure data pipelines using Azure Data Factory, Synapse, or Fabric.
  • Implement a data lakehouse architecture (Bronze/Silver/Gold) and establish best‑practice ETL/ELT frameworks.
  • Ingest and integrate data from multiple core systems, including ERP, finance, supply chain, and CRM platforms.
  • Develop and optimise SQL data models and support the creation of Power BI‑ready datasets.
  • Apply and document data governance, quality, and validation rules within the platform.
  • Collaborate with Finance and IT stakeholders to translate reporting needs into technical solutions.
  • Monitor, troubleshoot, and optimise data pipelines for performance and cost efficiency.
  • Define reusable components, standards, and documentation to support long‑term scalability.
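For context, the Bronze/Silver/Gold ("medallion") layering mentioned above describes staged refinement: raw records land in Bronze, are cleaned and standardised into Silver, then aggregated into analytics-ready Gold tables. A minimal Python sketch of the idea (all data and names here are hypothetical; a real pipeline would use Data Factory, Synapse, or PySpark rather than plain Python):

```python
from collections import defaultdict

# Bronze: raw ingested rows, exactly as received (may contain bad data).
bronze = [
    {"order_id": "1001", "region": "UK ", "amount": "250.00"},
    {"order_id": "1002", "region": "uk", "amount": "not-a-number"},
    {"order_id": "1003", "region": "DE", "amount": "410.50"},
]

def to_silver(rows):
    """Silver: validated, typed, and standardised records."""
    silver = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine unparseable rows
        silver.append({
            "order_id": row["order_id"],
            "region": row["region"].strip().upper(),
            "amount": amount,
        })
    return silver

def to_gold(rows):
    """Gold: aggregated, reporting-ready figures (e.g. revenue by region)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

print(to_gold(to_silver(bronze)))  # {'UK': 250.0, 'DE': 410.5}
```

The same shape carries over to the Azure stack: Bronze maps to raw files in the data lake, Silver to validated Delta/lakehouse tables, and Gold to the SQL models exposed to Power BI.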
Essential Skills & Experience
  • Proven experience building Azure data platforms end‑to‑end (Data Factory, Synapse, Fabric, or Databricks).
  • Strong SQL development and data modelling capability.
  • Experience integrating ERP or legacy systems into cloud data platforms.
  • Proficiency in Python or PySpark for transformation and automation.
  • Understanding of data governance, access control, and security within Azure.
  • Hands‑on experience preparing data for Power BI or other analytics tools.
  • Excellent communication skills - able to bridge the gap between technical and non‑technical stakeholders.
  • Strong documentation habits and attention to detail.
Desirable Skills & Experience
  • Experience with AS400, Tagetik, or similar finance systems.
  • Familiarity with Power BI Premium, RLS, and workspace governance.
  • Knowledge of Azure DevOps and CI/CD for data pipelines.
  • Exposure to data quality tools or frameworks.