Senior Specialist Solutions Architect

DATABRICKS ASIAPAC UNIFIED ANALYTICS PTE. LTD.

Singapore

On-site

SGD 100,000 - 125,000

Full time

4 days ago

Job summary

A global leader in analytics solutions in Singapore is looking for a Sr. Specialist Solutions Architect to guide customers in building big data solutions. In this customer-facing role, you will provide technical leadership on various data technologies, optimize production workloads, and contribute to the Databricks Community. Candidates should have experience in data engineering, machine learning, and programming with Apache Spark, Python, or Java. You will also collaborate closely with Solution Architects on advanced technical sales and education initiatives.

Qualifications

  • Experience in a customer-facing technical role.
  • Production programming experience in languages such as Python, R, Scala, or Java.
  • Experience with big data technologies like Spark/Delta, Hadoop, NoSQL.

Responsibilities

  • Guide customers in building big data solutions.
  • Architect production level workloads and optimize performance.
  • Provide technical expertise in data management, cloud platforms, and architecture.

Skills

Apache Spark™ expertise
Data Engineering
Performance tuning
Machine Learning
Data Management
Cloud Platforms

Education

Degree in a quantitative discipline (Computer Science, Applied Mathematics, Operations Research)

Tools

Python
R
Scala
Java
AWS
Azure
GCP

Job description

As a Sr. Specialist Solutions Architect (Sr. SSA), you will guide customers in building big data solutions on Databricks that span a wide variety of use cases. This is a customer-facing role, working with and supporting the Solution Architects, and it requires hands-on production experience with Apache Spark™ and expertise in other data technologies. SSAs help customers design and successfully implement essential workloads while aligning their technical roadmap for expanding usage of the Databricks Data Intelligence Platform. As a deep go-to expert reporting to the Sr. Manager, Field Engineering (Specialists), you will continue to strengthen your technical skills through mentorship, learning, and internal training programs, and establish yourself in an area of specialty, whether that be performance tuning, machine learning, industry expertise, or more.

The impact you will have:
  • Provide technical leadership to guide strategic customers to successful implementations on big data projects, ranging from architectural design to data engineering to model deployment
  • Architect production level workloads, including end-to-end pipeline load performance testing and optimisation
  • Provide technical expertise in an area such as data management, cloud platforms, data science, machine learning, or architecture
  • Assist Solution Architects with more advanced aspects of the technical sale including custom proof of concept content, estimating workload sizing, and custom architectures
  • Improve community adoption (through tutorials, training, hackathons, and conference presentations)
  • Contribute to the Databricks Community
What we look for:
  • Experience in a customer-facing technical role, with expertise in at least one of the following:
    • Software Engineer/Data Engineer: query tuning, performance tuning, troubleshooting, and debugging Spark or other big data solutions.
    • Data Scientist/ML Engineer: model selection, model lifecycle, hyperparameter tuning, model serving, deep learning.
    • Data Applications Engineer: building use cases that use data, such as risk modelling, fraud detection, and customer lifetime value.
  • Experience with design and implementation of big data technologies such as Spark/Delta, Hadoop, NoSQL, MPP, OLTP, and OLAP.
  • Experience maintaining and extending production data systems to evolve with complex needs.
  • Production programming experience in Python, R, Scala, or Java.
  • Deep specialty expertise in at least one of the following areas:
    • Scaling big data workloads so they are performant and cost-effective.
    • Development tools for CI/CD, unit and integration testing, automation and orchestration, REST APIs, BI tools, and SQL interfaces.
    • Designing data solutions on cloud infrastructure and services, such as AWS, Azure, or GCP, using best practices in cloud security and networking.
    • ML concepts covering model tracking, model serving, and other aspects of productionizing ML pipelines in distributed data processing environments like Apache Spark, using tools such as MLflow.
  • Degree in a quantitative discipline (Computer Science, Applied Mathematics, Operations Research).