
Power BI Engineer

Head Resourcing Ltd

Glasgow

Hybrid

GBP 50,000 - 70,000

Full time

Today


Job summary

A leading UK consumer brand in Glasgow is seeking a Power BI Report Engineer to transform its reporting platform using Azure and Databricks. The ideal candidate will specialize in semantic modeling, dataset optimization, and governance, ensuring clean, trusted data insights. Responsibilities include using PBIP and Git for dataset management, optimizing Power BI performance, and collaborating with data engineering teams. Required skills include DAX, SQL, and Git-based workflows.

Qualifications

  • 3-5+ years building enterprise Power BI datasets and dashboards.
  • Strong ability to optimize dataset performance.
  • Excellent design intuition and clean layout skills.

Responsibilities

  • Build and maintain enterprise PBIP datasets fully version-controlled in Git.
  • Develop semantic models on curated Gold Databricks tables.
  • Optimize performance and tune Databricks SQL Warehouse queries.

Skills

DAX expertise
Semantic modelling
SQL skills
Power BI data analytics
Git-based workflows

Education

PL-300: Power BI Data Analyst Associate
DP-600: Fabric Analytics Engineer Associate

Tools

Azure DevOps
Databricks
Tabular Editor 3

Job description

Power BI Report Engineer (Azure / Databricks)
Glasgow | 3-4 days onsite | Exclusive Opportunity with a Leading UK Consumer Brand

Are you a Power BI specialist who loves clean, governed data and high-performance semantic models? Do you want to work with a business that's rebuilding its entire BI estate the right way: proper Lakehouse architecture, curated Gold tables, PBIP, Git, and end-to-end governance? If so, this is one of the most modern, forward-thinking Power BI engineering roles in Scotland.

Our Glasgow-based client is transforming its reporting platform using Azure + Databricks, with Power BI sitting on top of a fully curated Gold Layer. They develop everything using PBIP + Git + Tabular Editor 3, and semantic modelling is treated as a first-class engineering discipline. This is your chance to own the creation of high-quality datasets and dashboards used across Operations, Finance, Sales, Logistics and Customer Care, turning trusted Lakehouse data into insights the business relies on every day.

Why This Role Exists

To turn clean, curated Gold Lakehouse data into trusted, enterprise‑grade Power BI insights. You’ll own semantic modelling, dataset optimisation, governance and best‑practice delivery across a modern BI ecosystem.

What You’ll Do
Semantic Modelling with PBIP + Git
  • Build and maintain enterprise PBIP datasets fully version‑controlled in Git.
  • Use Tabular Editor 3 for DAX, metadata modelling, calc groups and object governance.
  • Manage branching, pull requests and releases via Azure DevOps.
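Naming and object-governance checks of the kind implied by the PBIP + Git workflow above are often automated before a pull request merges. Purely as an illustration (the model structure and conventions below are invented, not the actual PBIP schema), a minimal Python sketch of such a lint might look like:

```python
# Hypothetical sketch: lint table and measure names in a PBIP-style
# model definition before a pull request is merged.
# The dict layout and naming conventions here are illustrative only.
import re

def lint_model(model: dict) -> list[str]:
    """Return a list of naming-convention violations found in the model."""
    issues = []
    for table in model.get("tables", []):
        # Assumed convention: table names are PascalCase with no spaces.
        if not re.fullmatch(r"[A-Z][A-Za-z0-9]*", table["name"]):
            issues.append(f"Table '{table['name']}' is not PascalCase")
        for measure in table.get("measures", []):
            # Assumed convention: measures start with a capital letter
            # and may contain spaces, e.g. 'Total Sales'.
            if not re.fullmatch(r"[A-Z][A-Za-z0-9 ]*", measure["name"]):
                issues.append(f"Measure '{measure['name']}' violates naming rules")
    return issues

model = {
    "tables": [
        {"name": "Sales", "measures": [{"name": "Total Sales"},
                                       {"name": "total_margin"}]},
        {"name": "dim customer", "measures": []},
    ]
}
print(lint_model(model))
```

In practice a check like this would run in the Azure DevOps pipeline as a PR gate, alongside Tabular Editor's own best-practice analyzer rules.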
Lakehouse‑Aligned Reporting (Gold Layer Only)
  • Develop semantic models exclusively on top of curated Gold Databricks tables.
  • Work closely with Data Engineering on schema design and contract‑first modelling.
  • Maintain consistent dimensional modelling aligned to the enterprise Bus Matrix.
High‑Performance Power BI Engineering
  • Optimise performance: aggregations, composite models, incremental refresh, DQ/Import strategy.
  • Tune Databricks SQL Warehouse queries for speed and cost efficiency.
  • Monitor PPU capacity performance, refresh reliability and dataset health.
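Incremental refresh itself is configured declaratively inside Power BI, but the rolling-window arithmetic behind such a policy can be sketched. Assuming a hypothetical policy that archives a fixed number of years while reprocessing only the most recent days:

```python
# Conceptual sketch of the rolling-window arithmetic behind an
# incremental refresh policy: keep `archive_years` of history, but only
# reprocess the most recent `refresh_days` on each run. The real policy
# is configured in Power BI, not computed like this.
from datetime import date, timedelta

def refresh_window(today: date, refresh_days: int, archive_years: int):
    """Return (archive_start, refresh_start, refresh_end) boundaries."""
    refresh_end = today
    refresh_start = today - timedelta(days=refresh_days)
    # Approximate year arithmetic; real policies use calendar periods.
    archive_start = date(today.year - archive_years, today.month, today.day)
    return archive_start, refresh_start, refresh_end

print(refresh_window(date(2025, 6, 15), refresh_days=14, archive_years=2))
```

The point of the split is that only the refresh window hits the Databricks SQL Warehouse on a scheduled run, which is where most of the speed and cost savings come from.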
Governance, Security & Standards
  • Implement RLS/OLS, naming conventions, KPI definitions and calc groups.
  • Apply dataset certification, endorsements and governance metadata.
  • Align semantic models with lineage and security policies across the Azure/Databricks estate.
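In the model itself, RLS is expressed as DAX filter expressions on security roles; as a hedged, language-neutral illustration of the concept only, the equivalent filtering logic looks like this (the user-to-region mapping is invented):

```python
# Conceptual sketch of row-level security (RLS): each user sees only
# rows matching their assigned regions. In Power BI this would be a
# DAX filter on a role; the mapping below is invented for illustration.
USER_REGIONS = {"ops.glasgow@example.com": {"Scotland"}}

rows = [
    {"Region": "Scotland", "Sales": 120},
    {"Region": "Wales", "Sales": 80},
]

def apply_rls(user: str, rows: list) -> list:
    """Return only the rows the given user is allowed to see."""
    allowed = USER_REGIONS.get(user, set())
    return [r for r in rows if r["Region"] in allowed]

print(apply_rls("ops.glasgow@example.com", rows))
```

OLS works analogously but hides whole columns or tables rather than rows.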
Lifecycle, Release & Best Practice Delivery
  • Use Power BI Deployment Pipelines for Dev → UAT → Prod releases.
  • Enforce semantic CI/CD patterns with PBIP + Git + Tabular.
  • Build reusable, certified datasets and dataflows enabling scalable self‑service BI.
Adoption, UX & Collaboration
  • Design intuitive dashboards with consistent UX across multiple business functions.
  • Support BI adoption through training, documentation and best‑practice guidance.
  • Use telemetry to track usage, performance and improve user experience.
What We’re Looking For
Required Certifications

To meet BI engineering standards, candidates must hold:

  • PL-300: Power BI Data Analyst Associate
  • DP-600: Fabric Analytics Engineer Associate
Skills & Experience
  • 3‑5+ years building enterprise Power BI datasets and dashboards.
  • Strong DAX and semantic modelling expertise (calc groups, conformed dimensions, role‑playing dimensions).
  • Strong SQL skills; comfortable working with Databricks Gold‑layer tables.
  • Proven ability to optimise dataset performance (aggregations, incremental refresh, DQ/Import).
  • Experience working with Git‑based modelling workflows and PR reviews via Tabular Editor.
  • Excellent design intuition: clean layouts, drill paths, and KPI logic.
Nice to Have
  • Python for automation or ad‑hoc prep; PySpark familiarity.
  • Understanding of Lakehouse patterns, Delta Lake, metadata‑driven pipelines.
  • Unity Catalog / Purview experience for lineage and governance.
  • RLS/OLS implementation experience.