Senior Specialist Solutions Architect

DATABRICKS ASIAPAC UNIFIED ANALYTICS PTE. LTD.

Singapore

On-site

SGD 90,000 - 120,000

Full time

Today

Job summary

A leading data analytics company in Singapore is seeking a Sr. Specialist Solutions Architect to guide customers in building big data solutions on Databricks. This role requires hands-on experience with Apache Spark and expertise in various data technologies. You will provide technical leadership on implementations, architect production-level workloads, and contribute to community adoption through mentorship and training programs. Ideal candidates will have a strong background in data management and cloud platforms, along with programming experience in Python, R, Scala, or Java.

Qualifications

  • Experience in a customer-facing technical role.
  • Deep knowledge of data management and architecture.
  • Expertise in Apache Spark and big data solutions.

Responsibilities

  • Guide customers in building big data solutions on Databricks.
  • Architect production-level workloads and optimize performance.
  • Provide technical leadership and improve community adoption.

Skills

Big data technologies
Apache Spark expertise
Cloud platforms
Data management
Machine learning concepts
Python programming

Education

Degree in Computer Science or related field

Tools

Python
R
Scala
Java

Job description

As a Sr. Specialist Solutions Architect (Sr. SSA), you will guide customers in building big data solutions on Databricks that span a wide variety of use cases. This is a customer-facing role, working with and supporting the Solution Architects, and it requires hands-on production experience with Apache Spark and expertise in other data technologies. SSAs guide customers through the design and successful implementation of essential workloads while aligning their technical roadmap for expanding usage of the Databricks Data Intelligence Platform. As a deep go-to expert reporting to the Sr. Manager, Field Engineering (Specialists), you will continue to strengthen your technical skills through mentorship, learning, and internal training programs, and establish yourself in an area of specialty, whether that be performance tuning, machine learning, industry expertise, or more.

The impact you will have:
  • Provide technical leadership to guide strategic customers to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment
  • Architect production-level workloads, including end-to-end pipeline load performance testing and optimisation
  • Provide technical expertise in an area such as data management, cloud platforms, data science, machine learning, or architecture
  • Assist Solution Architects with more advanced aspects of the technical sale including custom proof of concept content, estimating workload sizing, and custom architectures
  • Improve community adoption (through tutorials, training, hackathons, conference presentations)
  • Contribute to the Databricks Community
What we look for:
  • Experience in a customer-facing technical role with expertise in at least one of the following:
      • Software Engineer/Data Engineer: query tuning, performance tuning, troubleshooting, and debugging Spark or other big data solutions.
      • Data Scientist/ML Engineer: model selection, model lifecycle, hyperparameter tuning, model serving, deep learning.
      • Data Applications Engineer: building use cases that use data, such as risk modelling, fraud detection, and customer lifetime value.
  • Experience with the design and implementation of big data technologies such as Spark/Delta, Hadoop, NoSQL, MPP, OLTP, and OLAP.
  • Experience maintaining and extending production data systems to evolve with complex needs.
  • Production programming experience in Python, R, Scala, or Java.
  • Deep specialty expertise in at least one of the following areas:
      • Scaling big data workloads to be performant and cost-effective.
      • Development tools for CI/CD, unit and integration testing, automation and orchestration, REST APIs, BI tools, and SQL interfaces.
      • Designing data solutions on cloud infrastructure and services such as AWS, Azure, or GCP, using best practices in cloud security and networking.
      • ML concepts covering model tracking, model serving, and other aspects of productionizing ML pipelines in distributed data processing environments like Apache Spark, using tools like MLflow.
  • Degree in a quantitative discipline (Computer Science, Applied Mathematics, Operations Research).