Data Architect

ZipRecruiter

Derby

Hybrid

GBP 60,000 - 75,000

Part time

7 days ago

Job summary

A dynamic tech consultancy is seeking a Data Architect (Databricks SME) to help design and optimize a scalable data platform in Derby. The role involves hands-on development and architecture, requiring expertise in Databricks and Azure services. Candidates should have solid skills in Spark, Python, and SQL. This contract position is likely to extend beyond the year, offering a collaborative work environment.

Qualifications

  • In-depth experience with Databricks Lakehouse architecture.
  • Strong knowledge of Azure services.
  • Solid hands-on skills in Spark, Python, and SQL.

Responsibilities

  • Design and implement Databricks Lakehouse architecture.
  • Develop ETL/ELT pipelines using Spark, Python, and SQL.
  • Integrate with Azure services and BI tools.

Skills

Databricks expertise
Azure services knowledge
Hands-on skills with Spark
Proficiency in Python
Solid SQL skills
Understanding of data governance
CI/CD familiarity
Infrastructure as Code knowledge
Communication skills

Tools

Databricks
Azure Data Lake
Azure Data Factory
Power BI
Git
Terraform

Job description

Data Architect (Databricks SME)

Location - Derby (on site 1-2 days per month)

Type - Contract (until the end of the year, likely to extend)

Role Overview:

I'm looking for a Databricks Specialist with strong experience in the Azure ecosystem to help design, build, and optimise a scalable data platform. This role combines hands-on development with architecture, governance, and team support to ensure a high-performing and secure environment. The role reports to a project delivery lead and works closely with internal technical teams.

Key Responsibilities:

  • Design and implement Databricks Lakehouse architecture (Delta Lake, Unity Catalog, etc.)
  • Develop ETL/ELT pipelines using Spark, Python, SQL, and Databricks workflows
  • Integrate with Azure services and BI tools (e.g., Power BI)
  • Optimise performance and support CI/CD and MLOps pipelines
  • Enable knowledge transfer through code reviews, training, and reusable templates

Key Skills:

  • In-depth experience with Databricks (Delta Lake, Unity Catalog, Lakehouse architecture).
  • Strong knowledge of Azure services (e.g. Data Lake, Data Factory, Synapse).
  • Solid hands-on skills in Spark, Python, PySpark, and SQL.
  • Understanding of data modelling, governance, and BI integration.
  • Familiarity with CI/CD, Git, and Infrastructure as Code (e.g. Terraform).
  • Excellent communication and mentoring skills.

Desirable:

  • Experience with Qlik or similar BI tools.
  • Strong data analysis and SQL abilities.

If this sounds like something you are interested in, please get in contact: thomas.deakin@spgresourcing.com

SPG Resourcing is an equal opportunities employer and is committed to fostering an inclusive workplace which values and benefits from the diversity of the workforce we hire. We offer reasonable accommodation at every stage of the application and interview process.
