
Resident Solutions Architect

DATABRICKS ASIAPAC UNIFIED ANALYTICS PTE. LTD.

Singapore

On-site

SGD 80,000 - 120,000

Full time

Today

Job summary

A leading analytics firm in Singapore is seeking an experienced professional to guide customers in implementing big data projects. Responsibilities include facilitating workshops, ensuring best practices, and mentoring team members. The ideal candidate has 6+ years of experience with Big Data technologies and strong coding skills in Python or Scala. This is a full-time, on-site position.

Qualifications

  • 6+ years of experience in the design and implementation of Big Data technologies.
  • Deep technical knowledge in coding with Python or Scala.
  • Familiarity with cloud deployment models (GCP/Azure/AWS) is a plus.

Responsibilities

  • Guide customers in implementing transformational big data projects.
  • Facilitate technical workshops and design sessions.
  • Architect, design, and document complex customer engagements.

Skills

  • Big Data technologies (Apache Spark)
  • Python
  • Scala
  • Cloud deployment models (GCP/Azure/AWS)

Job description

The impact you will have:
  • You will guide customers as they implement transformational big data projects, including end-to-end development and deployment of industry-leading big data and AI applications
  • You will ensure that Databricks best practices are followed on all projects and that our standards for quality of service and implementation are strictly upheld
  • You will facilitate technical workshops, discovery and design sessions, and customer requirements gathering and scoping for new and existing strategic customers
  • You will assist the Professional Services leader and project managers with level-of-effort estimation and risk mitigation in customer proposals and statements of work
  • You will architect, design, develop, deploy, operationalize, and document complex customer engagements, individually or as part of an extended team, serving as the technical lead and overall authority
  • You will transfer knowledge to, enable, and mentor other team members, customers, and partners, including by developing reusable project artifacts
  • You will contribute experience to the consulting team and share client-engagement best practices with other teams
What we look for:
  • 6+ years of experience in the design and implementation of Big Data technologies (Apache Spark) and familiarity with data architecture patterns (data warehouse, data lake, streaming, Lambda/Kappa architecture)
  • Deep technical knowledge and comfortable coding in Python or Scala
  • Familiarity with GCP/Azure/AWS/EC2 cloud deployment models (Public vs. VPC) is a plus
  • Experience working with teams across a diverse geographic area