Data Engineer

Somerset Bridge

Newcastle upon Tyne

Hybrid

GBP 55,000 - 69,000

Full time

3 days ago

Job summary

A leading company in insurance is looking for a Data Engineer to enhance their Azure data platform. This role involves designing and optimizing data pipelines, ensuring compliance with industry regulations, and collaborating with cross-functional teams to enable AI-driven analytics and automation.

Benefits

Hybrid working - 2 days in the office and 3 days working from home
25 days annual leave, increasing with service
Discretionary annual bonus
Pension scheme
Healthcare Cash Plan
Electric vehicle salary sacrifice scheme
Professional wellbeing and fitness app
Enhanced parental leave
Life Assurance
Employee Referral Scheme

Qualifications

  • Hands-on experience in building ELT pipelines using Azure Databricks.
  • Strong proficiency in SQL for data extraction, transformation, and optimisation.
  • Experience with data warehousing concepts and relational database design.

Responsibilities

  • Design, build, and maintain scalable ELT pipelines using Azure Databricks.
  • Collaborate with stakeholders to deliver data-driven solutions.
  • Implement data models to support analytics and machine learning.

Skills

SQL (T-SQL, Spark SQL)
Data Pipeline Development
Azure Databricks
Data Modelling
Data Quality
Python (PySpark)
Cloud Data Engineering
CI/CD Pipelines

Education

Degree in Computer Science or related field

Tools

Azure Data Factory
Databricks Workflows
Delta Lake
Azure Synapse Analytics

Job description

Data Engineer

Application Deadline: 27 June 2025

Department: [SBSS] Enterprise Data Management

Employment Type: Permanent - Full Time

Location: Newcastle

Reporting To: Mike Jolley

Compensation: £55,000 - £68,500 / year


Description

We're building something special — and we need a talented Data Engineer to help bring our Azure data platform to life.

This is your chance to work on a greenfield Enterprise Data Warehouse programme in the insurance sector, shaping data pipelines and platforms that power smarter decisions, better pricing, and sharper customer insights.

The Data Engineer will design, build, and optimise scalable data pipelines within Azure Databricks, ensuring high-quality, reliable data is available to support pricing, underwriting, claims, and operational decision-making. This role is critical in modernising SBG’s cloud-based data infrastructure, ensuring compliance with FCA/PRA regulations, and enabling AI-driven analytics and automation.

By leveraging Azure-native services, such as Azure Data Factory (ADF) for orchestration, Delta Lake for ACID-compliant data storage, and Databricks Structured Streaming for real-time data processing, the Data Engineer will help unlock insights, enhance pricing accuracy, and drive innovation. The role also includes optimising Databricks query performance, implementing robust security controls (RBAC, Unity Catalog), and ensuring enterprise-wide data reliability.
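The core guarantee the paragraph above attributes to Delta Lake is atomic update-or-insert ("MERGE") semantics. As a much-simplified sketch in plain Python, assuming illustrative field names (`policy_id`, `premium`) not taken from the posting, the pattern looks like this; in Databricks it would be a `MERGE INTO` against a real Delta table:

```python
# Simplified sketch of Delta Lake MERGE (upsert) semantics. A "table" here
# is just a dict keyed by primary key; field names are hypothetical.

def merge_upsert(target: dict, updates: list, key: str) -> dict:
    """Apply a batch atomically: matched rows are overwritten, new rows inserted."""
    staged = dict(target)        # work on a copy so a failure leaves target intact
    for row in updates:
        staged[row[key]] = row   # update-or-insert by primary key
    return staged                # "commit": caller swaps in the new version

policies = {
    "P1": {"policy_id": "P1", "premium": 320},
    "P2": {"policy_id": "P2", "premium": 450},
}
batch = [
    {"policy_id": "P2", "premium": 475},   # update existing row
    {"policy_id": "P3", "premium": 280},   # insert new row
]
policies = merge_upsert(policies, batch, "policy_id")
```

The copy-then-swap step is what makes the batch all-or-nothing; readers never observe a half-applied update, which is the property the posting's "ACID-compliant data storage" refers to.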

Working closely with Data Architects, Pricing Teams, Data Analysts, and IT, this role will ensure our Azure Databricks data ecosystem is scalable, efficient, and aligned with business objectives. Additionally, the Data Engineer will contribute to cost optimisation, governance, and automation within Azure’s modern data platform.


Key Responsibilities
  • Data Pipeline Development – Design, build, and maintain scalable ELT pipelines using Azure Databricks, Azure Data Factory (ADF), and Delta Lake to automate real-time and batch data ingestion.
  • Cloud Data Engineering – Develop and optimise data solutions within Azure, ensuring efficiency, cost-effectiveness, and scalability, leveraging Azure Synapse Analytics, ADLS Gen2, and Databricks Workflows.
  • Data Modelling & Architecture – Implement robust data models to support analytics, reporting, and machine learning, using Delta Lake and Azure Synapse.
  • Automation & Observability – Use Databricks Workflows, dbt, and Azure Monitor to manage transformations, monitor query execution, and implement data reliability checks.
  • Data Quality & Governance – Ensure data integrity, accuracy, and compliance with industry regulations (FCA, Data Protection Act, PRA) using Databricks Unity Catalog and Azure Purview.
  • Collaboration & Stakeholder Engagement – Work closely with Data Scientists, Pricing, Underwriting, and IT to deliver data-driven solutions aligned with business objectives.
  • Data Governance & Security – Implement RBAC, column-level security, row-access policies, and data masking to protect sensitive customer data and ensure FCA/PRA regulatory compliance.
  • Innovation & Continuous Improvement – Identify and implement emerging data technologies within the Azure ecosystem, such as Delta Live Tables (DLT), Structured Streaming, and AI-driven analytics to enhance business capabilities.
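The "data reliability checks" responsibility above can be sketched, much simplified, in stand-alone Python. In Databricks this would typically be declarative Delta Live Tables expectations; this illustrative version (rule and field names are assumptions, not from the posting) just splits rows into passing and quarantined sets:

```python
# Illustrative row-level data quality checks: each rule is a named predicate;
# rows failing any rule are quarantined with the list of failed rule names.

def check_rows(rows, rules):
    """Apply named predicate rules; return (valid_rows, quarantined_rows)."""
    valid, quarantined = [], []
    for row in rows:
        failures = [name for name, rule in rules.items() if not rule(row)]
        if failures:
            quarantined.append({"row": row, "failed": failures})
        else:
            valid.append(row)
    return valid, quarantined

rules = {
    "premium_positive": lambda r: r.get("premium", 0) > 0,
    "has_policy_id":    lambda r: bool(r.get("policy_id")),
}
rows = [
    {"policy_id": "P1", "premium": 320},   # passes both rules
    {"policy_id": "",   "premium": -5},    # fails both rules
]
valid, quarantined = check_rows(rows, rules)
```

Quarantining rather than dropping failed rows preserves an audit trail, which matters under the FCA/PRA compliance requirements the posting describes.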

Skills, Knowledge and Expertise
  • Hands-on experience in building ELT pipelines and working with large-scale datasets using Azure Data Factory (ADF) and Databricks.
  • Strong proficiency in SQL (T-SQL, Spark SQL) for data extraction, transformation, and optimisation.
  • Proficiency in Azure Databricks (PySpark, Delta Lake, Spark SQL) for big data processing.
  • Knowledge of data warehousing concepts and relational database design, particularly with Azure Synapse Analytics.
  • Experience working with Delta Lake for schema evolution, ACID transactions, and time travel in Databricks.
  • Strong Python (PySpark) skills for big data processing and automation.
  • Experience with Scala (optional but preferred for advanced Spark applications).
  • Experience working with Databricks Workflows & Jobs for data orchestration.
  • Strong knowledge of feature engineering and feature stores, particularly the Databricks Feature Store for ML training and inference.
  • Experience with data modelling techniques to support analytics and reporting.
  • Familiarity with real-time data processing and API integrations (e.g., Kafka, Spark Streaming).
  • Proficiency in CI/CD pipelines for data deployment using Azure DevOps, GitHub Actions, or Terraform for Infrastructure as Code (IaC).
  • Understanding of MLOps principles, including continuous integration (CI), continuous delivery (CD), and continuous training (CT) for machine learning models.
  • Experience with performance tuning and query optimisation for efficient data workflows.
  • Strong understanding of query optimisation techniques in Databricks (caching, partitioning, indexing, and auto-scaling clusters).
  • Experience monitoring Databricks workloads using Azure Monitor, Log Analytics, and Databricks Performance Insight.
  • Familiarity with cost optimisation strategies in Databricks and ADLS Gen2 (e.g., managing compute resources efficiently).
  • Problem-solving mindset – Ability to diagnose issues and implement efficient solutions.
  • Experience implementing Databricks Unity Catalog for data governance, access control, and lineage tracking.
  • Understanding of Azure Purview for data cataloging and metadata management.
  • Familiarity with object-level and row-level security in Azure Synapse and Databricks.
  • Experience working with Azure Event Hubs, Azure Data Explorer, or Kafka for real-time data streaming.
  • Hands-on experience with Databricks Structured Streaming for real-time and near-real-time data processing.
  • Understanding of Delta Live Tables (DLT) for automated ELT and real-time transformations.
  • Analytical thinking – Strong ability to translate business needs into technical data solutions.
  • Attention to detail – Ensures accuracy, reliability, and quality of data.
  • Communication skills – Clearly conveys technical concepts to non-technical stakeholders.
  • Collaboration – Works effectively with cross-functional teams, including Pricing, Underwriting, and IT.
  • Adaptability – Thrives in a fast-paced, agile environment with evolving priorities.
  • Stakeholder management – Builds strong relationships and understands business requirements.
  • Innovation-driven – Stays up to date with emerging technologies and industry trends.
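The column-level security and data-masking items above are normally configured declaratively (Unity Catalog column masks, or dynamic data masking in Synapse). As a hedged sketch of the effect only, with hypothetical column names and masking policies:

```python
# Illustrative column masking: each governed column maps to a masking
# function; ungoverned columns pass through unchanged. Column names and
# policies are assumptions for illustration, not the employer's scheme.

def mask_columns(row: dict, policies: dict) -> dict:
    """Return a copy of row with each governed column passed through its mask."""
    return {col: policies.get(col, lambda v: v)(val) for col, val in row.items()}

POLICIES = {
    "postcode": lambda v: v.split(" ")[0] + " ***",   # keep outward code only
    "email":    lambda v: "***@" + v.split("@")[-1],  # keep domain only
}

customer = {"id": 42, "postcode": "NE1 4ST", "email": "jo@example.com"}
masked = mask_columns(customer, POLICIES)
```

Applying masks at read time, per caller role, is what lets analysts query customer tables without ever seeing raw personal data, supporting the FCA/PRA compliance goals described earlier.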

Our Benefits
  • Hybrid working – 2 days in the office and 3 days working from home
  • 25 days annual leave, rising to 27 days over 2 years’ service and 30 days after 5 years’ service. Plus bank holidays!
  • Discretionary annual bonus
  • Pension scheme – 5% employee, 6% employer
  • Flexible working – we will always consider applications for those who require less than the advertised hours
  • Flexi-time
  • Healthcare Cash Plan – claim cashback on a variety of everyday healthcare costs
  • Electric vehicle – salary sacrifice scheme
  • Hundreds of exclusive retailer discounts
  • Professional wellbeing, health & fitness app - Wrkit
  • Enhanced parental leave, including time off for IVF appointments
  • Religious bank holidays – if you don’t celebrate Christmas and Easter, you can use these annual leave days on other occasions throughout the year.
  • Life Assurance - 4 times your salary
  • 25% Car Insurance Discount
  • 20% Travel Insurance Discount
  • Cycle to Work Scheme
  • Employee Referral Scheme
  • Community support day