Resident Solution Architect

JEET ANALYTICS PTE. LTD.

Singapore

Hybrid

SGD 90,000 - 130,000

Full time

Today

Job summary

A leading data analytics firm in Singapore is seeking a Senior Architect specializing in Databricks to design and implement scalable Lakehouse-based data solutions. This contract role offers the flexibility to work on-site or remotely. Responsibilities include optimizing data workflows, mentoring teams, and ensuring best practices in data engineering and governance. Candidates should have hands-on experience with Databricks, a strong understanding of cloud ecosystems, and proficiency in Python or Scala. This is an exciting opportunity to drive impactful AI solutions.

Qualifications

  • Hands-on experience with Databricks including workspaces, notebooks, and clusters.
  • Expertise in Delta Lake, DLT, and SQL Warehouses.
  • Proficiency in Python or Scala for data engineering tasks.

Responsibilities

  • Design and implement scalable data pipelines using Delta Live Tables, Spark SQL, Python, or Scala.
  • Optimize ETL, streaming, and ML workloads for performance.
  • Administer Databricks Workspaces and cluster policies.

Skills

Databricks Workspace Setup
Delta Lake
Python
Scala
AWS
Terraform Automation
MLOps Practices

Tools

Terraform
Git
CI/CD
Splunk
Prometheus
CloudWatch

Job description

We’re seeking a hands-on Senior Architect – Databricks with deep technical expertise in building and optimizing Lakehouse-based data and AI solutions. This is a contract role based in Singapore, offering the flexibility to work on-site with clients or remotely within the region.

In this role, you’ll design, develop, and operationalize Delta Lakehouse architectures using Databricks, driving real-world outcomes for enterprise customers. You’ll take ownership of implementation tasks, lead technical delivery, and mentor engineering teams in best practices across data engineering, governance, and AI.

Key Responsibilities

  • Design and implement scalable data pipelines using Delta Live Tables (DLT), Spark SQL, Python, or Scala (see the sketch after this list).
  • Optimize ETL, streaming, and ML workloads for performance, cost efficiency, and reliability.
  • Administer and configure Databricks Workspaces, Unity Catalog, and cluster policies for secure, governed environments.
  • Automate infrastructure and deployments using Terraform, Git, and CI/CD pipelines.
  • Implement observability, cost optimization, and monitoring frameworks using tools like Splunk, Prometheus, or CloudWatch.
  • Collaborate with customers to build AI and LLM solutions leveraging MLflow, DBRX, and Mosaic AI.
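To give a flavor of the pipeline work described above, here is a minimal Delta Live Tables sketch in Python. It is an illustrative sketch only: the source path, table names, and columns are hypothetical, not drawn from this job description.

  # Minimal DLT pipeline sketch (path, table names, and columns are hypothetical).
  # `spark` is provided by the DLT pipeline runtime.
  import dlt
  from pyspark.sql.functions import col

  @dlt.table(comment="Raw events ingested incrementally with Auto Loader")
  def raw_events():
      # 'cloudFiles' is Databricks Auto Loader; the path is a placeholder.
      return (spark.readStream.format("cloudFiles")
              .option("cloudFiles.format", "json")
              .load("/mnt/raw/events"))

  @dlt.table(comment="Validated events ready for downstream consumption")
  @dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")
  def clean_events():
      # The expectation drops rows with a NULL event_id before they land downstream.
      return dlt.read_stream("raw_events").select(
          col("event_id"),
          col("event_ts").cast("timestamp"),
      )

A pipeline like this runs as a managed DLT job, and the expectation decorator is one way DLT enforces the data-quality practices the role calls for.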

Required Skills & Experience

  • Strong hands‑on experience with Databricks, including workspace setup, notebooks, clusters, and job orchestration.
  • Expertise in Delta Lake, DLT, Unity Catalog, and SQL Warehouses.
  • Proficiency in Python or Scala for data engineering and ML workflows (see the MLflow sketch after this list).
  • Strong understanding of AWS, Azure, or GCP cloud ecosystems.
  • Experience with Terraform automation, DevOps, and MLOps practices.
  • Familiarity with monitoring and governance frameworks for large‑scale data platforms.
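As one illustration of the Python-for-ML proficiency listed above, here is a minimal MLflow experiment-tracking sketch; the run name, parameter, and metric values are invented for the example.

  # Minimal MLflow tracking sketch (names and values are illustrative only).
  import mlflow

  with mlflow.start_run(run_name="baseline"):
      mlflow.log_param("model_type", "gradient_boosting")  # hypothetical parameter
      mlflow.log_metric("rmse", 0.42)                      # hypothetical metric
      # A real workflow would also log and register the trained model, e.g.
      # via mlflow.sklearn.log_model(model, "model", ...).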

Nice to Have

  • Experience developing AI/LLM pipelines and RAG architectures on Databricks (see the serving-endpoint sketch after this list).
  • Exposure to Bedrock, OpenAI, or Hugging Face integrations.
  • Databricks certifications (Data Engineer, Machine Learning, or Solutions Architect) preferred.
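For the AI/LLM work mentioned above, a common pattern on Databricks is querying a model-serving endpoint through the MLflow deployments client. This is a hedged sketch: the endpoint name is a placeholder, and it assumes an mlflow installation with Databricks support and configured workspace credentials.

  # Querying a model-serving endpoint via the MLflow deployments client.
  # The endpoint name below is a placeholder for whatever is deployed in the workspace.
  from mlflow.deployments import get_deploy_client

  client = get_deploy_client("databricks")
  response = client.predict(
      endpoint="databricks-dbrx-instruct",
      inputs={"messages": [{"role": "user",
                            "content": "Summarize Delta Lake in one sentence."}]},
  )
  print(response)

In a RAG architecture, the user message would be augmented with retrieved context (for example, from a vector search index) before the endpoint call.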