Databricks Engineer - SC Cleared

iO Associates

England

Remote

GBP 80,000 - 100,000

Full time


Job summary

A technology recruitment firm is seeking an SC Cleared Databricks Engineer to design and optimise data pipelines, working fully remotely. The role requires extensive experience with Databricks and cloud platforms to ensure efficient delivery of data solutions. This is a 6-month contract at a rate of £500-£550 per day, starting ASAP.

Qualifications

  • Extensive experience with Databricks, Spark, Delta Lake, MLflow.
  • Proficiency in ETL/ELT development and orchestration tools.
  • Hands-on experience with cloud platforms like AWS, Azure, or GCP.

Responsibilities

  • Design and develop data pipelines and analytics solutions using Databricks.
  • Collaborate with data specialists and innovate data solutions.
  • Ensure efficient delivery of data solutions.

Skills

  • Databricks (Spark, Delta Lake, and MLflow)
  • ETL/ELT development tools (DBT, Airflow)
  • Cloud platforms (AWS, Azure, GCP)
  • SQL
  • Python
  • PySpark
  • CI/CD pipelines
  • DevOps practices

Job description

Databricks Engineer - SC Cleared

Fully Remote

Duration: 6 Months

Location: London

Rate: £500-£550 per day (Inside IR35)

Start Date: ASAP

Are you an SC Cleared Databricks Engineer with a passion for building scalable data solutions? Do you have extensive experience in designing and optimising data pipelines, cloud-native platforms, and advanced analytics ecosystems? We are working with a leading multinational client seeking a skilled Databricks Engineer to drive innovation and enable data-driven decision‑making, advanced analytics, and AI capabilities.

The successful candidate will be responsible for designing, developing, and optimising data pipelines and analytics solutions using Databricks within a secure environment.

Critical Skills
  • Extensive experience with Databricks (Spark, Delta Lake, and MLflow).
  • Proficiency in ETL/ELT development and orchestration tools (DBT, Airflow, or similar).
  • Hands‑on experience with cloud platforms (AWS, Azure, or GCP).
  • Solid understanding of SQL, Python, and PySpark for data processing.
  • Familiarity with CI/CD pipelines and DevOps practices for data solutions.

You will collaborate with a variety of data specialists. The role calls for a passion for working with complex data sets, spearheading innovation whilst ensuring efficient delivery of data solutions.

If this looks like the role for you, apply now and get in touch.
