Databricks Data Engineer

REGTECH INSIGHT PTE. LTD.

Singapore

Hybrid

SGD 80,000 - 100,000

Full time

Today

Job summary

A leading technology consulting firm in Singapore seeks a Senior Consultant – Databricks to design, develop, and operationalize Delta Lakehouse architectures. This contract role involves driving enterprise-level data and AI solutions, optimizing performance, and mentoring teams. Candidates should have strong expertise in Databricks, Python, and cloud technologies. Benefits include flexibility for on-site or remote work within the region and opportunities to contribute to innovative AI projects.

Qualifications

  • Strong hands-on experience with Databricks, including workspace setup and clusters.
  • Expertise in Delta Lake and SQL Warehouses.
  • Proficiency in Python or Scala for data workflows.

Responsibilities

  • Design and implement scalable data pipelines using Delta Live Tables.
  • Optimize ETL, streaming, and ML workloads for performance.
  • Automate infrastructure and deployments using Terraform.

Skills

Databricks hands-on experience
SQL proficiency
Python
Scala
AWS
Azure
GCP

Tools

Terraform
Git
CI/CD
Splunk
Prometheus
CloudWatch

Job description

We’re seeking a hands-on Senior Consultant – Databricks with deep technical expertise in building and optimizing Lakehouse-based data and AI solutions. This is a contract role based in Singapore, offering flexibility to work on-site with clients or remotely within the region.

In this role, you’ll design, develop, and operationalize Delta Lakehouse architectures using Databricks, driving real-world outcomes for enterprise customers. You’ll take ownership of implementation tasks, lead technical delivery, and mentor engineering teams in best practices across data engineering, governance, and AI.

Key Responsibilities
  • Design and implement scalable data pipelines using Delta Live Tables (DLT), Spark SQL, Python, or Scala (a brief DLT sketch follows this list).
  • Optimize ETL, streaming, and ML workloads for performance, cost efficiency, and reliability.
  • Administer and configure Databricks Workspaces, Unity Catalog, and cluster policies for secure, governed environments.
  • Automate infrastructure and deployments using Terraform, Git, and CI/CD pipelines.
  • Implement observability, cost optimization, and monitoring frameworks using tools like Splunk, Prometheus, or CloudWatch.
  • Collaborate with customers to build AI and LLM solutions leveraging MLflow, DBRX, and Mosaic AI.
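
The first bullet names Delta Live Tables; as a rough sketch only, the Python below shows the shape of a small two-table DLT pipeline. The landing path, column names (user_id, event_type, ts), and table names are hypothetical, and `spark` is injected by the Databricks runtime, so this runs only inside a DLT pipeline.

```python
import dlt
from pyspark.sql.functions import col

# NOTE: `spark` is provided by the Databricks DLT runtime; this file is a
# pipeline source, not a standalone script. All names below are hypothetical.

@dlt.table(comment="Raw events ingested incrementally from cloud storage")
def raw_events():
    return (
        spark.readStream.format("cloudFiles")   # Auto Loader for incremental ingest
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/events")            # hypothetical landing path
    )

@dlt.table(comment="Validated events; malformed rows are dropped")
@dlt.expect_or_drop("valid_user", "user_id IS NOT NULL")  # data-quality expectation
def clean_events():
    return dlt.read_stream("raw_events").select(
        col("user_id"),
        col("event_type"),
        col("ts").cast("timestamp").alias("event_time"),
    )
```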

Required Skills & Experience
  • Strong hands-on experience with Databricks, including workspace setup, notebooks, clusters, and job orchestration (see the SDK sketch after this list).
  • Expertise in Delta Lake, DLT, Unity Catalog, and SQL Warehouses.
  • Proficiency in Python or Scala for data engineering and ML workflows.
  • Strong understanding of AWS, Azure, or GCP cloud ecosystems.
  • Experience with Terraform automation, DevOps, and MLOps practices.
  • Familiarity with monitoring and governance frameworks for large-scale data platforms.
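
As a rough illustration of the workspace and job-orchestration experience asked for above, this sketch uses the official databricks-sdk for Python to define a one-task job on a fresh cluster. The job name, notebook path, runtime version, and node type are hypothetical placeholders; in practice these resources would usually be codified in Terraform, as the posting notes.

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute, jobs

# WorkspaceClient reads DATABRICKS_HOST / DATABRICKS_TOKEN from the
# environment or from a ~/.databrickscfg profile.
w = WorkspaceClient()

created = w.jobs.create(
    name="nightly-etl",  # hypothetical job name
    tasks=[
        jobs.Task(
            task_key="ingest",
            notebook_task=jobs.NotebookTask(
                notebook_path="/Repos/data-platform/etl/ingest",  # hypothetical path
            ),
            new_cluster=compute.ClusterSpec(
                spark_version="15.4.x-scala2.12",  # hypothetical runtime version
                node_type_id="i3.xlarge",          # hypothetical node type
                num_workers=2,
            ),
        )
    ],
)
print(f"created job {created.job_id}")
```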

Nice to Have
  • Experience developing AI/LLM pipelines and RAG architectures on Databricks (an MLflow tracking sketch follows this list).
  • Exposure to Bedrock, OpenAI, or Hugging Face integrations.
  • Databricks certifications (Data Engineer, Machine Learning, or Solutions Architect).
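
Since MLflow is called out for the AI and LLM work, here is a minimal experiment-tracking sketch; the experiment path, run name, parameters, and metric are all hypothetical illustrations, not values from the posting.

```python
import mlflow

# On Databricks, MLflow tracking is preconfigured; elsewhere, point
# MLFLOW_TRACKING_URI at a tracking server first.
mlflow.set_experiment("/Shared/llm-eval")  # hypothetical experiment path

with mlflow.start_run(run_name="baseline"):
    mlflow.log_param("model", "dbrx-instruct")   # hypothetical model choice
    mlflow.log_param("temperature", 0.2)
    mlflow.log_metric("answer_relevance", 0.87)  # hypothetical eval metric
```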