Databricks Architect

Boston Consulting Group

City of Westminster

Hybrid

GBP 70,000 - 90,000

Part time

2 days ago

Job summary

A leading consulting firm is seeking a highly experienced Databricks Architect in London (hybrid). The role involves leading the design and optimization of scalable data platforms and requires at least 5 years of hands-on Databricks architecture experience. Ideal candidates will have a strong background in Apache Spark, big data engineering, and cloud ecosystems, along with excellent leadership and communication skills. This position reports to the Director of Architecture within the Risk Intelligence team.

Qualifications

  • 10-15 years in data engineering or architecture; 5+ years hands-on Databricks experience.
  • Expert in Apache Spark, PySpark, and Databricks SQL.
  • Strong understanding of Lakehouse architecture and BI ecosystems.
  • Experience with CI/CD tools and DevOps practices.
  • Proven leadership in delivering enterprise-grade data solutions.

Responsibilities

  • Architect end-to-end Databricks Lakehouse solutions.
  • Define and enforce security standards for Databricks workloads.
  • Lead cloud architecture design on Azure/AWS/GCP.
  • Build and optimize large-scale ETL/ELT pipelines.
  • Implement data governance standards using Unity Catalog.

Skills

Apache Spark
PySpark
Databricks SQL
Delta Lake
Cloud architecture (Azure/AWS/GCP)
CI/CD
Infrastructure-as-Code (Terraform)
Machine Learning (MLflow)
Real-time data processing (Kafka/EventHub/Kinesis)

Job description

Overview

We are seeking a highly experienced Databricks Architect to lead the design, development, and optimization of scalable data platforms and advanced analytics solutions. This role requires a deep understanding of Databricks Lakehouse architecture, big data engineering, cloud ecosystems, and enterprise data strategy.

Responsibilities

  • Architect end-to-end Databricks Lakehouse solutions including data ingestion, ETL/ELT pipelines, Delta Lake, data warehousing, and data governance frameworks.
  • Define and enforce best practices, security standards, and performance optimization across Databricks workloads.
  • Design scalable big data architectures leveraging Spark, Delta Live Tables, Unity Catalog, and MLflow.
  • Lead cloud architecture design on Azure/AWS/GCP integrated with Databricks.
  • Build and optimize large-scale ETL/ELT pipelines using PySpark, SQL, and Delta.
  • Oversee data quality frameworks, metadata management, lineage, and monitoring.
  • Work closely with business leaders and product teams to translate requirements into robust technical solutions.
  • Guide development teams, perform architecture reviews, and ensure platform engineering excellence.
  • Conduct technical workshops, POCs, and roadmap planning for the Databricks environment.
  • Optimize Databricks clusters, query performance, and cost management.
  • Implement data governance standards using Unity Catalog, RBAC, and compliance guidelines (GDPR, ISO).
  • Ensure resilience, scalability, and disaster recovery readiness.

Requirements

  • Experience: 10‑15 years overall in data engineering or data architecture; 5+ years of hands‑on Databricks architecture experience.
  • Technical expertise: Expert in Apache Spark, PySpark, Databricks SQL, Delta Lake, and distributed systems; strong understanding of Lakehouse architecture, BI ecosystems, and modern data platforms.
  • Cloud expertise: Azure, AWS, GCP (Azure preferred for many UK clients).
  • Tools & practices: Experience with CI/CD, Infrastructure‑as‑Code (Terraform preferred), and DevOps.
  • Leadership: Proven experience leading teams and delivering enterprise‑grade data solutions.
  • Certifications: Databricks certifications (e.g., Databricks Certified Data Engineer Professional, Architect) desirable.
  • ML: Experience building ML solutions using MLflow or integrating with cloud ML services.
  • Domain: Experience in BFSI, Retail, Telecom, or Healthcare data environments.
  • Realtime: Experience with real‑time data processing (Kafka, EventHub, Kinesis).

Soft Skills

  • Excellent communication and stakeholder management skills.
  • Strong problem‑solving and solution‑oriented mindset.
  • Ability to lead in a fast‑paced and dynamic environment.
  • Collaborative, proactive, and able to provide strong technical direction.

Contract

Duration: 6 months. Location: London (hybrid). Reporting to the Director of Architecture within the Risk Intelligence team.
