Overview
We are seeking a highly experienced Databricks Architect to lead the design, development, and optimization of scalable data platforms and advanced analytics solutions. This role requires a deep understanding of Databricks Lakehouse architecture, big data engineering, cloud ecosystems, and enterprise data strategy.
Responsibilities
- Architect end-to-end Databricks Lakehouse solutions including data ingestion, ETL/ELT pipelines, Delta Lake, data warehousing, and data governance frameworks.
- Define and enforce best practices, security standards, and performance optimization across Databricks workloads.
- Design scalable big data architectures leveraging Spark, Delta Live Tables, Unity Catalog, and MLflow.
- Lead cloud architecture design on Azure/AWS/GCP integrated with Databricks.
- Build and optimize large-scale ETL/ELT pipelines using PySpark, SQL, and Delta Lake.
- Oversee data quality frameworks, metadata management, lineage, and monitoring.
- Work closely with business leaders and product teams to translate requirements into robust technical solutions.
- Guide development teams, perform architecture reviews, and ensure platform engineering excellence.
- Conduct technical workshops, POCs, and roadmap planning for the Databricks environment.
- Optimize Databricks clusters, query performance, and cost management.
- Implement data governance standards using Unity Catalog and role‑based access control (RBAC), aligned with compliance requirements such as GDPR and applicable ISO standards.
- Ensure resilience, scalability, and disaster recovery readiness.
Requirements
- Experience: 10‑15 years overall in data engineering or data architecture; 5+ years of hands‑on Databricks architecture experience.
- Technical expertise: Expert in Apache Spark, PySpark, Databricks SQL, Delta Lake, and distributed systems; strong understanding of Lakehouse architecture, BI ecosystems, and modern data platforms.
- Cloud expertise: Azure, AWS, GCP (Azure preferred for many UK clients).
- Tools & practices: Experience with CI/CD, Infrastructure‑as‑Code (Terraform preferred), and DevOps.
- Leadership: Proven experience leading teams and delivering enterprise‑grade data solutions.
- Certifications: Databricks certifications (e.g., Databricks Certified Data Engineer Professional, Architect) desirable.
- ML: Experience building ML solutions using MLflow or integrating with cloud ML services.
- Domain: Experience in BFSI, Retail, Telecom, or Healthcare data environments.
- Real‑time: Experience with streaming and real‑time data processing (Kafka, Azure Event Hubs, Kinesis).
Soft Skills
- Excellent communication and stakeholder management skills.
- Strong problem‑solving and solution‑oriented mindset.
- Ability to lead in a fast‑paced and dynamic environment.
- Collaborative, proactive, and able to provide strong technical direction.
Contract
- Duration: 6 months.
- Location: London (hybrid).
- Reporting: Director of Architecture, Risk Intelligence team.