Data Engineer

UK Tote Group

Greater Manchester

On-site

GBP 50,000 - 70,000

Full time

Job summary

A leading betting company in Greater Manchester seeks a Data Engineer to design and build data pipelines on AWS using Databricks. You'll work with real-time data, collaborating with various teams to deliver trusted insights. A strong background in Apache Spark, Delta Lake, and Python is required. The role offers competitive salary and benefits including a bonus scheme and 33 days holiday.

Benefits

Competitive Basic Salary
Discretionary Bonus Scheme
Company Shares Option Plan
Contributory pension scheme
Life insurance (4 x basic salary)
Simply Health Cash Plan
Holiday entitlement (33 days inclusive of bank holidays)
Study Support
Confidential 24/7 employee assistance helpline
Agile office environment with free parking

Qualifications

  • Proven expertise in building pipelines using Databricks.
  • Strong grasp of Apache Spark, particularly in PySpark or Scala.
  • Experience with real-time data ingestion and Kafka (MSK).
  • Deep understanding of Delta Lake and Medallion Architecture.
  • Proficient in Python and SQL for data engineering.

Responsibilities

  • Build streaming and batch data pipelines on AWS using Apache Spark.
  • Design, build, and optimise data pipelines using Databricks.
  • Ingest data from Kafka and AWS S3 into the Lakehouse.
  • Collaborate on high-performance dashboards with BI teams.
  • Ensure compliance with GDPR and Gambling Commission regulations.

Skills

Data pipelines
Apache Spark
Delta Lake
AWS Services
Python
SQL
Streaming architectures
Data governance frameworks
Collaboration
CI/CD pipelines

Tools

Databricks
Kafka (MSK)
Power BI
Terraform

Job description

Overview

At UK Tote Group, we're on a mission to reimagine the future of pool betting - building a modern, data-driven betting experience for millions of racing fans. Our technology powers real-time insights, supports responsible gaming, and helps us deliver trusted, customer-first products across the UK and international markets. As a Data Engineer, you'll play a key role in designing, building, and optimising the Databricks-based Lakehouse that drives real-time data, analytics, and reporting across the Tote.

What you'll be doing

You'll build streaming and batch data pipelines on AWS using Apache Spark Structured Streaming, Delta Live Tables (DLT), and Kafka (MSK), ensuring our business teams across Liquidity, Product, Marketing, Finance, and Compliance have fast, trusted data at their fingertips. This is a hands-on engineering role where you'll collaborate across engineering, BI, and product teams to deliver scalable, secure, and governed data solutions under Unity Catalog.

Responsibilities:

  • Design, build, and optimise data pipelines using Databricks, Spark Structured Streaming, and Delta Live Tables to ensure data flows efficiently and reliably across the organisation.
  • Develop robust Bronze, Silver, and Gold Delta tables following the Medallion Architecture, supporting analytics, APIs, and decision-making tools.
  • Ingest data from Kafka (MSK), AWS S3, and external APIs, ensuring seamless ingestion into the Lakehouse.
  • Collaborate with BI teams to enable high-performance Power BI dashboards through Databricks SQL Warehouses, making data accessible and actionable.
  • Govern, discover, and secure data under Unity Catalog; contribute to CI/CD pipelines for Databricks jobs, notebooks, and DLT workflows.
  • Monitor, tune, and troubleshoot pipeline performance using Databricks metrics, CloudWatch, and AWS Cost Explorer.
  • Document data models, schemas, and lineage; maintain understanding of data flows and dependencies.
  • Work with Compliance and Technology to ensure the platform remains compliant with GDPR and Gambling Commission regulations.
  • Champion best practices in data platform design, observability, and cost management to shape the Tote's data ecosystem.
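The Medallion layering in the responsibilities above (Bronze raw, Silver cleansed, Gold business-level) can be sketched conceptually. In practice these layers would be Delta tables processed with Spark or DLT; the plain-Python illustration below, with hypothetical betting-pool fields, only shows the shape of the flow:

```python
# Conceptual sketch of the Medallion pattern (Bronze -> Silver -> Gold).
# Field names (bet_id, stake, pool) are hypothetical examples, not the
# Tote's actual schema.

bronze = [  # raw events as ingested (e.g. from Kafka), stored untouched
    {"bet_id": "1", "stake": "5.00", "pool": "WIN"},
    {"bet_id": "2", "stake": "bad", "pool": "WIN"},   # malformed record
    {"bet_id": "3", "stake": "10.50", "pool": "PLACE"},
]

def to_silver(rows):
    """Cleanse and type-cast; drop records that fail validation."""
    out = []
    for r in rows:
        try:
            out.append({**r, "stake": float(r["stake"])})
        except ValueError:
            pass  # a real pipeline would quarantine these for review
    return out

def to_gold(rows):
    """Aggregate to a business-level view: total stake per pool."""
    totals = {}
    for r in rows:
        totals[r["pool"]] = totals.get(r["pool"], 0.0) + r["stake"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'WIN': 5.0, 'PLACE': 10.5}
```

In a real Databricks pipeline, each function would correspond to a Structured Streaming or DLT transformation writing to its own Delta table, with Unity Catalog governing access at each layer.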

What we are looking for

We're looking for an experienced Data Engineer with proven expertise in building pipelines in Databricks and a strong grasp of Apache Spark, whether in PySpark or Scala, including Structured Streaming. You should have experience with Kafka (MSK) and real-time data ingestion, as well as a deep understanding of Delta Lake, Delta Live Tables, and the Medallion Architecture. A strong AWS background is important, particularly with services such as S3, Glue, Lambda, Batch, and IAM.

You'll need to be proficient in Python and SQL for data engineering and analytics, and comfortable implementing CI/CD pipelines using tools such as GitHub Actions, Azure DevOps, or Jenkins. Solid experience with Git, version control, and Spark performance tuning will help you succeed in this role. Most importantly, you'll bring a collaborative, proactive attitude and an ability to balance platform reliability with the pace of delivery.

It would be an advantage if you have experience working with streaming architectures or data governance frameworks like Unity Catalog. Familiarity with Power BI, Looker, or Tableau is desirable, as is exposure to Databricks REST APIs, Airflow, or Databricks Workflows. Knowledge of infrastructure-as-code tools such as Terraform, AWS networking fundamentals, and cost management techniques using Photon and DBU monitoring will also be beneficial.

You'll be analytical, detail-oriented, and motivated by solving complex data challenges. A self-starter by nature, you take ownership of your work and enjoy designing end-to-end solutions that deliver measurable value. You're a great communicator who can explain data concepts clearly to both technical and non-technical colleagues, and you have a passion for automation, efficiency, and data quality. Most of all, you're curious and committed to continuous learning within a fast-evolving cloud data landscape.

What's in it for you?

At the Tote you can expect a friendly working environment with a strong sense of teamwork and pride in what we do. Within this role you'll develop a broad range of skills and experiences that can enhance your career at the Tote. Additionally, our company benefits package includes:

  • Competitive Basic Salary
  • Discretionary Bonus Scheme
  • Company Shares Option Plan
  • Contributory pension scheme
  • Life insurance (4 x basic salary)
  • Simply Health Cash Plan
  • Holiday entitlement (33 days inclusive of bank holidays)
  • Study Support and opportunity for progression and development
  • Confidential 24/7, 365-days-a-year employee assistance helpline
  • Agile and collaborative office environment with free parking, fruit, biscuits, and drinks
  • Regular social events, charity events, and volunteering opportunities
