
Databricks Architect

AVANADE ASIA PTE LTD

Kuala Lumpur

On-site

MYR 200,000 - 250,000

Full time

15 days ago


Job summary

A leading technology consulting firm in Kuala Lumpur is seeking an experienced Databricks Solution Architect to lead the design and implementation of scalable data platforms. The role requires deep, hands-on Databricks experience and a strong architectural background. Responsibilities include data governance, stakeholder collaboration, and cloud service integration. Candidates should have 10+ years in data architecture and proficiency in Python and SQL.

Qualifications

  • 10+ years of experience in data architecture or related role.
  • 5+ years designing large-scale data solutions on Databricks.
  • Expertise in performance tuning techniques.

Responsibilities

  • Define technical architecture for the enterprise data platform.
  • Design and implement Lakehouse architecture.
  • Lead integration of Databricks with core cloud services.

Skills

Data architecture
Data engineering
Lakehouse Architecture
Delta Lake
Apache Spark
Python/PySpark
Scala
SQL
Cloud platform security
Data modeling

Tools

Terraform

Job description


The Databricks Solution Architect is a senior technical leadership role responsible for defining, designing, and overseeing the implementation of scalable, secure, and high-performance data platforms using the Databricks Lakehouse Platform. This individual will translate business strategy into technical architecture, guide development teams, and ensure all data solutions align with enterprise standards for governance, security, and cost optimization.

Key Responsibilities
  • Architectural Leadership: Define and own the end‑to‑end technical architecture and roadmap for the enterprise data platform on Databricks, including data ingestion, transformation, storage, and consumption layers.
  • Lakehouse Design: Design and champion the implementation of the Lakehouse architecture utilizing Delta Lake, Databricks Unity Catalog, and Databricks SQL Warehouse to support all data, analytics, and AI/ML initiatives.
  • Data Governance & Security: Architect and enforce enterprise‑level data governance, security, and access control policies using Unity Catalog (e.g., fine‑grained access, lineage tracking, auditing).
  • Technical Guidance & Mentorship: Provide technical leadership, guidance, and mentorship to a team of Databricks Engineers. Conduct architectural reviews, code audits, and ensure adherence to best practices and standards.
  • Performance and Cost Optimization: Define strategies for, and lead, major performance tuning and cost optimization initiatives across all Databricks workloads, clusters, and Delta Lake storage.
  • Cloud Integration: Lead the integration of Databricks with core cloud services (AWS, Azure, or GCP) and other enterprise systems (e.g., data catalogs, BI tools, ML platforms).
  • Stakeholder Collaboration: Engage with executive stakeholders, data scientists, and business leaders to understand complex requirements and translate them into robust, scalable technical designs and solution blueprints.
  • DataOps and Automation Strategy: Define the DevOps/DataOps strategy for the Databricks environment, including continuous integration/continuous delivery (CI/CD) pipelines and Infrastructure‑as‑Code (IaC) using tools like Terraform or Databricks Asset Bundles.
  • Innovation: Evaluate new Databricks features, open‑source technologies (e.g., MLflow), and industry trends to drive continuous platform improvement and competitive advantage.

Qualifications
  • 10+ years of experience in data architecture, data engineering, or a related senior technical role.
  • 5+ years of deep, hands‑on experience designing and implementing large‑scale data solutions on the Databricks Platform.
  • Expertise in Lakehouse Architecture, Delta Lake, Apache Spark, and performance tuning techniques.
  • Proven experience implementing Databricks Unity Catalog for centralized governance.
  • Deep proficiency in Python/PySpark or Scala and advanced SQL.
  • Extensive experience with one major cloud platform and its security and data services.
  • Strong experience with data modeling (dimensional, data vault) and data warehouse concepts.
  • Demonstrated ability to lead technical discussions, document architectural designs, and communicate complex concepts to both technical and non‑technical audiences.
  • Preferred: Experience with real‑time/streaming data architectures (Structured Streaming).
  • Preferred: Experience with MLOps practices, including MLflow for model lifecycle management.
  • Preferred: Expertise in Infrastructure‑as‑Code tools (e.g., Terraform) and automated CI/CD for Databricks.
  • Preferred: Databricks Certified Data Engineer Professional, Databricks Certified Machine Learning Professional, or an equivalent cloud architecture certification.