
Databricks Machine Learning Lead

Avanade

Kuala Lumpur

On-site

MYR 150,000 - 200,000

Full time

2 days ago

Job summary

A leading technology consulting firm in Kuala Lumpur is looking for a skilled Databricks M/L Technical Lead. This senior, hands-on position is responsible for designing, developing, and delivering scalable data solutions on the Databricks Lakehouse Platform. Candidates should have over eight years of experience in machine learning or data science, including at least four years focused on Databricks, and must possess expert coding skills in Python and PySpark/Scala. The role demands strong leadership of a team of engineers and a deep understanding of data architecture principles.

Skills

Machine learning
Data science
MLOps
Python
PySpark
Scala
Delta Lake
MLflow

Tools

Databricks
Kafka
Terraform

Job description
The Databricks M/L Technical Lead is a senior, hands-on role responsible for the design, development, and delivery of highly scalable, secure, and performant data solutions on the Databricks Lakehouse Platform. The lead is expected to provide technical leadership to a team of engineers, defining coding standards, implementing architectural patterns, and ensuring the delivery of high-quality data products.

Key Responsibilities
  • Define and implement robust data architectures utilizing the Databricks ecosystem, including Delta Lake, Unity Catalog, machine learning models, and Databricks Workflows.
  • Serve as the most senior developer, writing high‑quality, production‑grade code in PySpark/Scala and SQL for complex batch and streaming ETL/ELT pipelines.
  • Lead performance tuning and optimization efforts for large‑scale Spark jobs, ensuring efficient cluster utilization and cost management.
  • Define and enforce technical standards, code quality, testing frameworks (unit, integration), and DataOps/CI/CD pipelines for the engineering team.
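Performance tuning of the kind described above often starts with right-sizing shuffle parallelism. The sketch below shows one common heuristic: choosing spark.sql.shuffle.partitions so that each shuffle partition lands near a target size. The 128 MiB target, the floor of 200 partitions, and the function name are illustrative assumptions, not requirements from this posting.

```python
def recommended_shuffle_partitions(input_bytes, target_partition_mib=128, min_partitions=200):
    """Estimate a starting value for spark.sql.shuffle.partitions.

    Heuristic sketch: aim for shuffle partitions of roughly
    `target_partition_mib` each, but never drop below a sane floor.
    All defaults here are illustrative assumptions.
    """
    target_bytes = target_partition_mib * 1024 * 1024
    needed = -(-input_bytes // target_bytes)  # ceiling division
    return max(min_partitions, needed)
```

For a 1 TiB shuffle this suggests 8192 partitions (2^40 / 2^27); tiny inputs fall back to the floor. In practice Spark's Adaptive Query Execution can coalesce partitions at runtime, so a heuristic like this is only a starting point for manual tuning.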
Qualifications
Required Skills & Experience
  • 8+ years of experience in machine learning, data science, or MLOps, with at least four years focused specifically on the Databricks Lakehouse Platform.
  • Expert proficiency in Python and PySpark/Scala for large‑scale data processing and machine learning.
  • Deep understanding and practical experience with Delta Lake architecture and optimization techniques.
  • Proven expertise implementing MLOps principles using MLflow (Tracking, Registry, Projects, and Deployment).
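One MLOps principle implied by the Registry/Deployment requirement is a promotion gate: a candidate model replaces the production model only if it clearly beats it. In practice this check would compare metrics logged to MLflow Tracking before transitioning a model in the Model Registry; the sketch below shows the gate framework-free, and the metric name, margin, and function name are hypothetical choices, not part of the posting.

```python
def should_promote(candidate_metrics, production_metrics,
                   metric="auc", min_improvement=0.005):
    """Champion/challenger gate (illustrative sketch).

    Promote the candidate only if it beats the current production model
    by at least `min_improvement` on the chosen metric. With no champion
    yet, the first valid candidate is promoted.
    """
    cand = candidate_metrics.get(metric)
    prod = production_metrics.get(metric)
    if cand is None:
        return False          # candidate never logged the metric
    if prod is None:
        return True           # no champion yet: promote first candidate
    return cand - prod >= min_improvement
```

A real pipeline would wrap this around the registry client, promoting only when the gate passes, rather than transitioning stages unconditionally.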
Preferred Skills & Certifications
  • Experience with Databricks features such as Databricks Workflows and Unity Catalog.
  • Experience with streaming technologies (e.g., Kafka, Spark Streaming).
  • Familiarity with CI/CD tools and Infrastructure‑as‑Code (e.g., Terraform, Databricks Asset Bundles).
  • Databricks Certified Machine Learning Professional certification.
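For the Infrastructure-as-Code item, a Databricks Asset Bundle is typically described in a databricks.yml at the project root. The fragment below is a minimal sketch only: the bundle name, job, notebook path, and cluster sizing are made-up placeholders.

```yaml
# databricks.yml -- minimal Asset Bundle sketch (all names are placeholders)
bundle:
  name: ml_pipelines

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: bronze_to_silver
          notebook_task:
            notebook_path: ./notebooks/bronze_to_silver
          new_cluster:
            spark_version: 15.4.x-scala2.12
            node_type_id: Standard_DS3_v2
            num_workers: 2

targets:
  dev:
    mode: development
    default: true
```

A bundle like this is deployed per target with `databricks bundle deploy -t dev`, which is what puts the job definition under version control alongside the pipeline code.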