Job Search and Career Advice Platform


BI Data Modelling

Yoti

Greater London

Hybrid

GBP 50,000 - 70,000

Full time

Today


Job summary

A data-driven technology firm in Greater London is looking for a BI Data Modelling Engineer to build and maintain robust data models. The role involves designing data models with LookML and Airflow, enhancing BI infrastructure, and supporting data engineering. Candidates should have strong SQL skills, familiarity with BI tools, and experience with Git for code management. The firm promotes a collaborative approach, enabling analysts to conduct independent analyses while ensuring data integrity across various platforms.

Qualifications

  • Strong experience with Git for code management.
  • Proven SQL skills for working with relational databases.
  • Understanding of BI tools and data governance standards.

Responsibilities

  • Design and maintain data models and a semantic layer.
  • Monitor and enhance the BI pipeline and infrastructure.
  • Support data engineering for ELT/ETL pipelines.
  • Create and maintain SQL Server jobs and stored procedures.

Skills

Git code management
Excellent command of SQL
Familiarity with BI tools like Looker and Power BI
Knowledge of data, master data, and metadata-related standards

Tools

Looker
Power BI
Redshift
Postgres
Python
Airflow

Job description

This role sits within Yoti's BI engineering team. Its primary responsibility is to build and maintain robust, scalable data models and a semantic layer that allow analysts and data scientists to conduct their own analyses. Our data is high in integrity but is collected from a diverse set of platforms and services across Yoti. The BI Data Modelling Engineer will design data models and semantic structures (dimensions, views, and measures using LookML and Airflow) to unify this data and enable self‑service reporting in Looker. Alongside data modelling, you will be involved in monitoring, optimising, and enhancing the BI pipeline and infrastructure.
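To illustrate the kind of semantic-layer work this involves — purely as a sketch, since the actual view, dimension, and table names are not part of this posting — a LookML view exposing a unified events source might look like:

```lookml
# Hypothetical example — view, field, and table names are illustrative,
# not taken from Yoti's actual codebase.
view: user_events {
  sql_table_name: analytics.user_events ;;

  dimension: event_id {
    primary_key: yes
    type: string
    sql: ${TABLE}.event_id ;;
  }

  dimension: platform {
    description: "Originating platform or service"
    type: string
    sql: ${TABLE}.platform ;;
  }

  dimension_group: event {
    type: time
    timeframes: [date, week, month]
    sql: ${TABLE}.event_at ;;
  }

  measure: event_count {
    type: count
    drill_fields: [event_id, platform]
  }
}
```

Views like this, combined into explores with joins, are what let analysts build their own Looker reports without writing SQL against the underlying warehouse tables.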

Responsibilities

  • Undertake a wide range of data modelling requirements with varying levels of complexity.
  • Design and develop data models in LookML and Airflow, creating views, explores, joins, dimensions, and measures, to unify data from different services.
  • Assist in defining and building data architecture for data warehouse and big data solutions (including fact and dimensional modelling).
  • Support data engineering efforts related to ELT/ETL pipelines and ensure proper ingestion into the data warehouse (e.g., Redshift, Postgres).
  • Monitor, optimise, and enhance the BI pipeline and infrastructure.
  • Support the delivery and migration of new data in an environment where change is constant and a continuous improvement cycle prevails.
  • Own and maintain model documentation and LookML/Airflow codebase in Git.
  • Work with the analyst team to maximise the benefits of business intelligence and data analytics, in line with their business objectives.
  • Provide input into data management practices, including data classification, data retention, data usage, data destruction, and other aspects of data usage across technical and non‑technical solutions.
  • Create well‑structured and descriptive code and documentation for data architecture, processes, and reporting.
  • Support Business As Usual (BAU) activities, including troubleshooting, performance monitoring, and data validation.
  • Create and maintain SQL Server jobs, SSIS packages, and stored procedures.
  • Identify, communicate, and resolve data quality and data reconciliation issues.
  • Troubleshoot and diagnose performance issues.

Requirements

Essential Skills

  • Git code management.
  • Excellent command of SQL for relational databases like Redshift and Postgres.
  • Familiarity with BI tools such as Looker and Power BI.
  • Knowledge of data, master data, and metadata‑related standards, processes, and technology.

Desirable Skills

  • Experience with Python and Airflow.
  • Knowledge of AWS.
  • Strong communication skills with both business and engineering stakeholders, including the ability to translate technical findings into actionable insights or requirements for non‑technical users.
  • Ability to communicate with non‑technical business users to determine specific business requirements for reports and business intelligence solutions.
  • Strong troubleshooting skills and the ability to diagnose performance issues effectively.