
Databricks Engineer - SC Cleared

iO Associates

Bristol

Remote

GBP 80,000 - 100,000

Full time


Job summary

A leading multinational organisation is seeking an SC Cleared Databricks Engineer to design and optimise scalable data solutions. This fully remote contract position offers £550 - £600 per day for a duration of 6 months. The ideal candidate will have extensive experience with Databricks, ETL/ELT tools, and cloud platforms such as AWS or Azure. Apply today to tackle complex data challenges and drive innovation.

Qualifications

  • Active SC Clearance is required.
  • Expertise in designing data pipelines and analytics solutions.
  • Ability to collaborate effectively with team members.

Responsibilities

  • Design, develop, and optimise data pipelines.
  • Collaborate with data specialists for high-quality solutions.

Skills

  • Extensive experience with Databricks
  • Proficiency in ETL/ELT development
  • Hands-on experience with cloud platforms
  • Strong knowledge of SQL
  • Strong knowledge of Python
  • Strong knowledge of PySpark
  • Familiarity with CI/CD pipelines

Job description

Databricks Engineer - SC Cleared

Contract Details:

  • Location: London (Fully Remote)
  • Duration: 6 Months
  • Rate: £550 - £600 per day
  • IR35: Inside
  • Start Date: ASAP

Are you an SC Cleared Databricks Engineer with a passion for building scalable data solutions? Do you have deep expertise in designing and optimising data pipelines, cloud‑native platforms, and advanced analytics ecosystems?

We're working with a leading multinational organisation seeking a skilled Databricks Engineer to drive innovation and enable data‑driven decision‑making, advanced analytics, and AI capabilities.

Key Responsibilities

  • Design, develop, and optimise data pipelines and analytics solutions using Databricks in a secure environment.
  • Collaborate with data specialists to deliver efficient, high‑quality solutions.

Critical Skills

  • Extensive experience with Databricks (including Spark, Delta Lake, and MLflow).
  • Proficiency in ETL/ELT development and orchestration tools (DBT, Airflow, or similar).
  • Hands‑on experience with cloud platforms (AWS, Azure, or GCP).
  • Strong knowledge of SQL, Python, and PySpark for data processing.
  • Familiarity with CI/CD pipelines and DevOps practices for data solutions.

This role is perfect for someone who thrives on solving complex data challenges and spearheading innovation while ensuring efficient delivery of data solutions.

If this sounds like the right opportunity for you, apply today and let's talk!
