Senior Data Engineer

Flannery Plant Hire (Oval) Ltd.

City of London

On-site

GBP 60,000 - 80,000

Full time

Job summary

A leading plant hire company in London seeks a Senior Data Engineer to design and maintain scalable data pipelines and enhance Power BI capabilities. The ideal candidate has over 5 years of experience, strong SQL and programming skills, and a practical understanding of data governance and cloud data models. This role includes competitive benefits and a focus on professional growth.

Benefits

Competitive salary and package
24 days' annual leave plus bank holidays
Training and professional development

Qualifications

  • 5+ years as a data engineer delivering production-grade solutions.
  • Proficiency in a programming language, preferably Python.
  • Hands-on experience with ETL/ELT processes and data models.

Responsibilities

  • Design and maintain scalable data pipelines.
  • Model robust data products for analytics.
  • Implement data governance, quality, and security.

Skills

Strong SQL
Python programming
ETL/ELT pipeline experience
Data governance knowledge
Power BI optimization

Tools

Microsoft Fabric
Azure
Git

Job description

Overview

Senior Data Engineer

Wembley - Office based

The Opportunity

We’re looking for a Senior Data Engineer who pairs solid engineering fundamentals with an analytical mindset. You’ll build reliable data foundations, enable high-quality reporting and self-serve analytics, and help us take our Power BI capability to the next level. We care more about attitude, work ethic, and proven delivery than formal qualifications.

What you’ll do
  • Design, build, and maintain scalable data pipelines and ELT/ETL processes across telematics, ERP, IoT, CRM, and finance systems.
  • Model and deliver robust data products (lakehouse/warehouse, marts, semantic models) that power BI, analytics, and data science.
  • Lead our use of Microsoft Fabric for ingestion, transformation, and analytics, including Lakehouse, Warehouse, Pipelines, Notebooks, Dataflows Gen2, and OneLake.
  • Build and optimise Power BI datasets and semantic models (including Direct Lake and incremental refresh) in partnership with analysts.
  • Implement data governance, quality, lineage, and security-by-design; drive documentation, testing, and version control standards.
  • Tune performance across SQL, storage, and BI layers; manage cost and reliability in the cloud (Azure preferred).
  • Enable streaming and near-real-time use cases where needed (e.g., telematics/IoT) using appropriate services.
  • Champion modern DataOps practices (CI/CD, environment management, automation) and mentor junior team members.
  • Collaborate closely with stakeholders to translate business needs into well-designed data solutions and clear metrics.
What you’ll bring (essential)
  • Proven experience (5+ years) as a data engineer delivering production-grade data solutions.
  • Strong SQL and proficiency in at least one programming language (Python preferred; Scala/Java also valuable).
  • Hands-on experience building and operating ETL/ELT pipelines and data models (Kimball, Data Vault, or similar).
  • Cloud data experience (Azure preferred), including storage, compute, and security fundamentals.
  • Practical Power BI experience as a data engineer: shaping data models, optimising for performance, and collaborating with analysts.
  • Solid grasp of data governance, privacy, and security best practices; comfortable with Git and documentation.
  • Bias to action, ownership, and continuous improvement; capable of balancing speed with quality.
Nice to have
  • Microsoft Fabric: creating and managing a Fabric Warehouse or Lakehouse (and underlying “database” objects), building Pipelines/Notebooks, using OneLake, and enabling Direct Lake for Power BI.
  • Experience with telematics, IoT, logistics, or construction data.
  • Streaming platforms and APIs (e.g., Kafka, Event Hubs) and batch orchestration.
  • DataOps/CI/CD for data (Azure DevOps/GitHub Actions), dbt, unit/integration testing for data.
  • DAX fundamentals and Power BI performance tuning at scale.
  • Formal qualifications or certifications (e.g., Azure DP-203, Microsoft Fabric Analytics Engineer) are welcome but not required.
What success looks like in 6–12 months
  • Reliable, well-documented pipelines feeding a governed lakehouse/warehouse.
  • Faster, more trusted Power BI reporting backed by high-quality semantic models.
  • Clear data standards adopted across the team, with CI/CD and automated testing in place.
  • Demonstrable cost, performance, and data quality improvements on key domains (e.g., telematics, operations, finance).
Benefits
  • Competitive salary and package.
  • 24 days' annual leave plus bank holidays, plus personal leave.
  • Training and professional development tailored to you.
  • Employee Assistance Programme.
  • Strong safety and sustainability culture.
  • Modern equipment and supportive, team-oriented environment.
  • Recognition programmes for outstanding performance.

If you’re a hands-on engineer who loves building practical, scalable data solutions and wants to grow an organisation’s Power BI and Microsoft Fabric capability, we’d like to hear from you.
