
Databricks Data Engineer (DPP)

Lucas Group

Greater London

Hybrid

GBP 70,000 - 90,000

Full time

Today


Job summary

A leading recruitment firm is seeking a Senior Databricks Data Engineer to work on scalable data platforms and high-performance analytics solutions. The role is based in London on a hybrid model with an initial contract of 6 months. Responsibilities include developing data pipelines, optimizing solutions, and mentoring junior engineers. Candidates should have hands-on experience with Databricks, Python, and SQL, as well as strong stakeholder communication skills. A Databricks certification is a plus.

Skills

Databricks
Python
SQL
PySpark
DevOps tools
Excellent communication
Stakeholder skills

Tools

Git
CI/CD
Infrastructure as Code (IaC)

Job description

Overview

Senior Databricks Data Engineer (DPP)

London Based (Hybrid)

6-month initial contract + extension

Outside IR35

We're looking for a Senior Databricks Data Engineer who has DPP status (Databricks Partner Program). You'll design and build scalable data platforms, develop pipelines, and help deliver high-performance analytics solutions across cloud environments.

Responsibilities

  • Develop and optimise data pipelines and lakehouse solutions using Databricks.
  • Work across AWS, Azure, or GCP environments.
  • Implement best practices in data engineering, CI/CD, and testing.
  • Collaborate with partner and client teams on solution design and delivery.
  • Mentor junior engineers and contribute to reusable frameworks and accelerators.

Skills & Experience

  • Strong hands-on experience with Databricks, Python, SQL, and PySpark.
  • Solid understanding of Delta Lake, Unity Catalog, and MLflow.
  • Experience with DevOps tools (Git, CI/CD, IaC).
  • Excellent communication and stakeholder skills.
  • Databricks certification and partner or consulting experience are a plus.
