
Data Platform Technical Lead (Azure & Databricks)

SAGL CONSULTING PTE. LTD.

Singapore

Hybrid

SGD 90,000 - 120,000

Full time

Today

Job summary

A consulting firm in Singapore is seeking a Data Platform Technical Lead to design and operate a cloud-based data platform on Azure Databricks. Responsibilities include leading platform design, building data pipelines, optimising performance, and enforcing data engineering standards. The ideal candidate has at least 5 years of data engineering experience and hands-on skills in Azure technologies. This hybrid role requires strong communication and collaboration with stakeholders, and offers a flexible work arrangement.

Skills

Azure Databricks
PySpark
Delta Lake
Unity Catalog
Azure Data Factory
CI/CD pipelines

Job Description

We are looking for a Data Platform Technical Lead to drive the design, build, and operation of a modern cloud-based data platform on Microsoft Azure and Databricks.

This role is ideal for a hands‑on senior data engineer who enjoys architecting scalable data solutions, setting engineering standards, and guiding teams on best practices—without being a people manager.

You will work closely with data engineers, application teams, and business stakeholders to deliver a secure, reliable, and high‑performing data platform that supports analytics and business decision‑making.

Key Responsibilities
  • Lead the technical design and evolution of an enterprise Azure Databricks data platform

  • Design and implement data ingestion, transformation, and orchestration pipelines

  • Build and maintain Lakehouse architecture (Bronze / Silver / Gold layers)

  • Implement data governance, access control, and metadata management using Unity Catalog

  • Collaborate with source system teams to deliver secure and reliable data integrations

  • Optimise platform performance, scalability, and cloud cost efficiency

  • Define and enforce data engineering standards, CI/CD practices, and operational guidelines

  • Troubleshoot complex platform issues and drive continuous improvements

  • Provide technical guidance and mentoring to data engineers and analysts

  • Translate business requirements into scalable technical data solutions
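
The Lakehouse responsibilities above follow the common medallion (Bronze/Silver/Gold) pattern: raw records land in Bronze, then are cleansed and deduplicated into Silver. As an illustrative sketch only — on Azure Databricks this step would be PySpark over Delta tables, and all field names here are hypothetical — plain Python stands in for the Bronze-to-Silver promotion logic so the example is self-contained:

```python
# Illustrative medallion-style promotion: Bronze (raw) -> Silver (cleansed).
# On Azure Databricks this logic would run as PySpark over Delta tables;
# plain Python dicts stand in here so the sketch is self-contained.

def bronze_to_silver(bronze_rows):
    """Cleanse and deduplicate raw Bronze records into Silver records.

    Hypothetical rules: drop rows missing a primary key, normalise
    string casing, and keep only the latest version of each key.
    """
    latest = {}
    for row in bronze_rows:
        key = row.get("customer_id")
        if key is None:  # reject malformed records with no primary key
            continue
        cleaned = {
            "customer_id": key,
            "name": row.get("name", "").strip().title(),
            "updated_at": row.get("updated_at", 0),
        }
        # Deduplicate: retain the most recently updated record per key.
        if key not in latest or cleaned["updated_at"] > latest[key]["updated_at"]:
            latest[key] = cleaned
    return list(latest.values())

bronze = [
    {"customer_id": 1, "name": "  alice ", "updated_at": 10},
    {"customer_id": 1, "name": "Alice B", "updated_at": 20},
    {"customer_id": None, "name": "ghost"},
]
silver = bronze_to_silver(bronze)
```

In a real pipeline the same cleanse/dedup rules would typically be expressed as a PySpark `MERGE INTO` against a Silver Delta table, with Unity Catalog governing who may read each layer.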

Requirements
  • Minimum 5 years of experience in data engineering or data platform development

  • Strong hands‑on experience with:

    • Azure Databricks
    • PySpark
    • Delta Lake
    • Unity Catalog

  • Proven experience with Azure Data Factory

  • Good understanding of distributed data processing and Lakehouse design

  • Experience building scalable ETL / ELT pipelines

  • Familiarity with CI/CD pipelines (Azure DevOps or GitHub Actions)

  • Strong communication and stakeholder engagement skills
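
The CI/CD familiarity asked for above (Azure DevOps or GitHub Actions) usually means automated checks on pipeline code before deployment. A minimal GitHub Actions workflow that runs a project's unit tests on every push might look like the following sketch — repository layout, file names, and the `requirements.txt`/`tests/` paths are hypothetical:

```yaml
# Hypothetical CI workflow: run unit tests for pipeline code on each push/PR.
name: data-pipeline-ci
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: pytest tests/
```

A real Databricks deployment stage would follow the test job, but its exact shape depends on the team's tooling, so it is omitted here.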

Working Hours: Monday to Friday, 9:00 AM – 6:00 PM
Work Arrangement: Hybrid (2 days onsite, 3 days remote)
Work Location: West Singapore, easily accessible by public transport

#DataEngineering #DataPlatform #AzureDatabricks #AzureDataFactory #UnityCatalog #PySpark #DeltaLake
