Cloudera / Data Engineer

NCS Hong Kong and Singapore

Singapore

On-site

SGD 100,000 - 130,000

Full time

Job summary

NCS Hong Kong and Singapore is seeking a Cloudera/Data Engineer to design and maintain scalable data pipelines within the Cloudera Hadoop ecosystem. The successful candidate will work collaboratively with data analysts and DevOps teams, ensure data accessibility and security, and mentor junior engineers. A strong background in big data technology and cloud platforms is required for this role.

Job description

Company Description

NCS is a leading technology services firm that operates across the Asia Pacific region in over 20 cities, providing consulting, digital services, technology solutions, and more. We believe in harnessing the power of technology to achieve extraordinary things, creating lasting value and impact for our communities, partners, and people. Our diverse workforce of 13,000 has delivered large-scale, mission-critical, and multi-platform projects for governments and enterprises in Singapore and the APAC region.

Job Description

Cloudera / Data Engineer

As a Cloudera / Data Engineer, you will be responsible for designing, building, and maintaining scalable data pipelines and platforms, with a strong focus on the Cloudera Hadoop ecosystem. You will work closely with data analysts, scientists, and business stakeholders to ensure data accessibility, quality, and security.

What will you do?

  • Design, build, and manage the Cloudera Hadoop Distribution (CDH/CDP).
  • Develop and maintain ETL pipelines using tools such as Apache NiFi, Hive, Spark, and Impala.
  • Manage and optimize HDFS, YARN, Kafka, HBase, and Oozie workflows.
  • Monitor and troubleshoot cluster performance and jobs with strong problem-solving and debugging skills.
  • Collaborate with DevOps and Data Science teams to integrate data platforms into applications and analytics workflows.
  • Ensure data governance, security, and compliance using tools like Apache Ranger, Atlas, and Kerberos.
  • Mentor and guide a team of data engineers to deliver robust data solutions.

Qualifications

The ideal candidate should possess:

  • 10+ years of experience in big data engineering, preferably with Cloudera.
  • Strong programming skills in Python, Java, and Spark.
  • Experience with Apache Spark, Hive, Impala, and Kafka.
  • Familiarity with Linux/Unix and shell scripting.
  • Degree in Computer Science, Information Technology, or a related field.

Preferred Skills:

  • Cloudera Certified Professional (CCP) or Cloudera Data Platform certification.
  • Experience with or knowledge of cloud platforms (AWS, Azure, or GCP) and hybrid deployments.
  • Familiarity with CI/CD pipelines, Docker, or Kubernetes in a data context.

Additional Information

We are driven by our AEIOU beliefs - Adventure, Excellence, Integrity, Ownership, and Unity - and we seek individuals who embody these values in both their professional and personal lives. We are committed to our Impact: Valuing our clients, Growing our people, and Creating our future.

Together, we make the extraordinary happen.

Learn more about us at ncs.co and visit our LinkedIn career site.
