This role sits within Yoti's BI engineering team. Its primary responsibility is to build and maintain robust, scalable data models and a semantic layer that allow analysts and data scientists to conduct their own analyses. Although our data is high in integrity, it is collected from a diverse set of platforms and services across Yoti. The BI Data Modelling Engineer will design data models and semantic structures (dimensions, views, and measures, using LookML and Airflow) to unify this data and enable self‑service reporting in Looker. Alongside data modelling, you will monitor, optimise, and enhance the BI pipeline and infrastructure.
Responsibilities
- Undertake a wide range of data modelling requirements with varying levels of complexity.
- Design and develop data models in LookML and Airflow, creating views, explores, joins, dimensions, and measures to unify data from different services.
- Assist in defining and building data architecture for data warehouse and big data solutions (including fact and dimensional modelling).
- Support data engineering efforts related to ELT/ETL pipelines and ensure proper ingestion into the data warehouse (e.g., Redshift, Postgres).
- Monitor, optimise, and enhance the BI pipeline and infrastructure.
- Support the delivery and migration of new data in an environment where change is constant and continuous improvement is the norm.
- Own and maintain model documentation and LookML/Airflow codebase in Git.
- Work with the analyst team to maximise the benefits of business intelligence and data analytics, in line with their business objectives.
- Provide input into data management practices, including data classification, retention, usage, and destruction, across technical and non‑technical solutions.
- Create well‑structured and descriptive code and documentation for data architecture, processes, and reporting.
- Support Business As Usual (BAU) activities, including troubleshooting, performance monitoring, and data validation.
- Create and maintain SQL Server jobs, SSIS packages, and stored procedures.
- Identify, communicate, and resolve data quality and data reconciliation issues.
- Troubleshoot and diagnose performance issues.
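To illustrate the kind of semantic-layer modelling described above, here is a minimal LookML sketch; the view, table, and field names (`orders`, `analytics.orders`, `amount`) are hypothetical, not Yoti's actual schema:

```lookml
view: orders {
  sql_table_name: analytics.orders ;;

  # Primary-key dimension used for joins in explores
  dimension: order_id {
    primary_key: yes
    type: number
    sql: ${TABLE}.order_id ;;
  }

  # Time dimension group exposing date/week/month rollups
  dimension_group: created {
    type: time
    timeframes: [date, week, month]
    sql: ${TABLE}.created_at ;;
  }

  # Aggregate measure for self-service reporting
  measure: total_revenue {
    type: sum
    sql: ${TABLE}.amount ;;
  }
}
```

Views like this are then joined in explores so analysts can slice measures by shared dimensions without writing SQL.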
Requirements
Essential Skills
- Git code management.
- Excellent command of SQL for relational databases like Redshift and Postgres.
- Familiarity with BI tools such as Looker or Power BI.
- Knowledge of data, master data, and metadata‑related standards, processes, and technology.
Desirable Skills
- Experience with Python and Airflow.
- Knowledge of AWS.
- Strong communication skills with both business and engineering stakeholders, including the ability to translate technical findings into actionable insights and to elicit specific reporting and business intelligence requirements from non‑technical users.
- Strong troubleshooting skills and the ability to diagnose performance issues effectively.