Solutions Architect, Singapore

Cloudera

Remote

SGD 120,000 - 160,000

Full time

Job summary

A leading data management firm is recruiting a Solutions Architect in Singapore to develop scalable Big Data solutions. The role involves working closely with clients, implementing the Cloudera Data Platform, and optimizing performance in various customer environments. Candidates should possess over 15 years of IT experience and expertise in Hadoop and related technologies. Enjoy a generous PTO policy and career development support in a flexible work environment.

Benefits

Generous PTO Policy
Flexible WFH Policy
Comprehensive Benefits

Qualifications

  • 15+ years of experience in Information Technology and System Architecture.
  • 8+ years of Professional Services experience architecting large scale solutions.
  • 10+ years designing and deploying large-scale Hadoop solutions.

Responsibilities

  • Implement Big Data solutions using Cloudera Data Platform.
  • Design and implement platform architectures for customers.
  • Analyze complex distributed production deployments.

Skills

Customer requirements translation
Hadoop expertise
Big data use cases
Security configurations (LDAP / AD, Kerberos)
Data transformation solutions
Enterprise Linux environment

Tools

Apache Hive
NiFi
Bash shell scripts
Python
Ansible
Job Description

At Cloudera, we empower people to transform complex data into clear and actionable insights. With as much data under management as the hyperscalers, we're the preferred data partner for the top companies in almost every industry. Powered by the relentless innovation of the open source community, Cloudera advances digital transformation for the world’s largest enterprises.

Cloudera is seeking a Solutions Architect to join its APAC Professional Services team in Singapore. In this role you’ll have the opportunity to develop massively scalable solutions to complex data problems using Hadoop, NiFi, Spark and related Big Data technologies. This client-facing role combines consulting skills with deep technical design and development in the Big Data space, and offers the successful candidate the opportunity to travel across Asia Pacific, working with large customer organizations across multiple industries.

As a Solutions Architect you will
  • Work directly with customers to implement Big Data solutions at scale using the Cloudera Data Platform and Cloudera Dataflow
  • Design and implement Hadoop and NiFi platform architectures and configurations for customers
  • Perform platform installation and upgrades for advanced secured cluster configurations
  • Analyze complex distributed production deployments, and make recommendations to optimize performance
  • Document and present complex architectures to customers’ technical teams
  • Work closely with Cloudera teams at all levels to help ensure the success of consulting engagements with customers
  • Drive projects with customers to successful completion
  • Write and produce technical documentation, blogs and knowledge-base articles
  • Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers’ requirements
  • Keep current with the Hadoop Big Data ecosystem technologies
  • Attend speaking engagements when needed
  • Travel up to 75%
We’re excited about you if you have
  • 15+ years of experience in Information Technology and System Architecture
  • 8+ years of customer-facing Professional Services experience architecting large-scale storage, data centre and/or globally distributed solutions
  • 10+ years designing and deploying 3 tier architectures or large-scale Hadoop solutions
  • Expert in big data use cases, able to recommend standard design patterns commonly used in Hadoop-based and streaming data deployments
  • In-depth knowledge of the data management ecosystem, including concepts of data warehousing, ETL and data integration
  • Expert at understanding and translating customer requirements into technical requirements
  • Extensive experience implementing data transformation and processing solutions
  • Extensive experience designing data queries against data in the HDFS environment using tools such as Apache Hive
  • Extensive experience setting up multi-node Hadoop clusters
  • Extensive experience configuring security (LDAP/AD, Kerberos/SPNEGO)
  • Cloudera Software and/or HDP certification (HDPCA/HDPCD) is a plus
  • Strong experience implementing software and / or solutions in the enterprise Linux environment
  • Strong understanding with various enterprise security solutions such as LDAP and / or Kerberos
  • Extensive experience migrating data platforms to a data lakehouse architecture
You may also have
  • Strong understanding of network configuration, devices, protocols, speeds and optimisations
  • Strong understanding of the Java ecosystem including debugging, logging, monitoring and profiling tools
  • Skilled with scripting tools such as Bash shell scripts, Python and/or Perl, and with automation tools such as Ansible, Chef and Puppet
  • Experience in architecting data center solutions – properly selecting server and storage hardware based on performance, availability and ROI requirements
What you can expect from us
  • Generous PTO Policy
  • Flexible WFH Policy to support work-life balance
  • Mental & Physical Wellness programs
  • Phone and Internet Reimbursement program
  • Access to Continued Career Development
  • Comprehensive Benefits and Competitive Packages
  • Employee Resource Groups