Cloudera / Big Data Engineer

NCS PTE. LTD.

Singapore

On-site

SGD 80,000 - 120,000

Full time

14 days ago

Job summary

NCS PTE. LTD., a leading technology services firm, seeks a Cloudera / Big Data Engineer. The role involves designing scalable data pipelines with a focus on the Cloudera Hadoop ecosystem and requires a minimum of 5 years' experience in big data engineering. Ideal candidates should possess skills in Python, Java, and Spark, along with a degree in Computer Science or a related field. Join a team focused on impactful technology solutions across the Asia Pacific region.

Qualifications

  • 5+ years of experience in big data engineering, preferably with Cloudera.
  • Strong programming skills in Python and Java, with hands-on experience in Spark.
  • Experience with ETL pipelines and tools.

Responsibilities

  • Design, build, and manage the Cloudera Hadoop Distribution.
  • Develop and maintain ETL pipelines using Apache NiFi, Hive, Spark, and Impala.
  • Monitor and troubleshoot cluster performance.

Skills

Python
Java
Spark
Linux/Unix

Education

Degree in Computer Science
Degree in Information Technology

Tools

Apache NiFi
Hive
Spark
Kafka

Job description

NCS is a leading technology services firm operating in more than 20 cities across the Asia Pacific region, providing consulting, digital services, technology solutions, and more. We believe in harnessing the power of technology to achieve extraordinary things, creating lasting value and impact for our communities, partners, and people. Our diverse workforce of 13,000 has delivered large-scale, mission-critical, and multi-platform projects for governments and enterprises in Singapore and the APAC region.

As a Cloudera / Big Data Engineer, you will be responsible for designing, building, and maintaining scalable data pipelines and platforms, with a strong focus on the Cloudera Hadoop ecosystem. You will work closely with data analysts, data scientists, and business stakeholders to ensure data accessibility, quality, and security.

What will you do?

  • Design, build, and manage the Cloudera Hadoop Distribution (CDH/CDP).
  • Develop and maintain ETL pipelines using tools such as Apache NiFi, Hive, Spark, and Impala (an illustrative sketch follows this list).
  • Manage and optimize HDFS, YARN, Kafka, HBase, and Oozie workflows.
  • Monitor and troubleshoot cluster performance and jobs with strong problem-solving and debugging skills.
  • Collaborate with DevOps and Data Science teams to integrate data platforms into applications and analytics workflows.
  • Ensure data governance, security, and compliance using tools like Apache Ranger, Atlas, and Kerberos.
  • Mentor and guide a team of data engineers to deliver robust data solutions.
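
For illustration only, and not part of the role requirements: the sketch below shows the kind of minimal Spark-based ETL step such pipelines often contain on CDH/CDP, reading a raw Hive table, aggregating it, and writing a partitioned table that Hive or Impala consumers can query. All database, table, and column names (raw.raw_events, curated.events_daily, event_ts) are hypothetical.

    # Illustrative sketch only; table and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("events-daily-rollup")
        .enableHiveSupport()      # read/write Hive-managed tables on the cluster
        .getOrCreate()
    )

    # Extract: read a raw Hive table ingested upstream (e.g. via NiFi or Kafka).
    raw = spark.table("raw.raw_events")

    # Transform: basic cleansing plus a daily aggregate per event type.
    daily = (
        raw.filter(F.col("event_ts").isNotNull())
           .withColumn("event_date", F.to_date("event_ts"))
           .groupBy("event_date", "event_type")
           .agg(F.count("*").alias("event_count"))
    )

    # Load: write a partitioned table that Impala/Hive consumers can query.
    (
        daily.write
             .mode("overwrite")
             .partitionBy("event_date")
             .saveAsTable("curated.events_daily")
    )

    spark.stop()

In practice, a job like this would typically be scheduled through Oozie or another orchestrator running on the cluster.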

The ideal candidate should possess:

  • 5+ years of experience in big data engineering, preferably with Cloudera.
  • Strong programming skills in Python and Java.
  • Experience with Apache Spark, Hive, Impala, and Kafka.
  • Familiarity with Linux/Unix and shell scripting.
  • Degree in Computer Science, Information Technology, or a related field.

Preferred Skills:

  • Cloudera Certified Professional (CCP) or Cloudera Data Platform certification.
  • Experience with or knowledge of cloud platforms (AWS, Azure, or GCP) and hybrid deployments.
  • Familiarity with CI/CD pipelines, Docker, or Kubernetes in a data context.

We are driven by our AEIOU beliefs—Adventure, Excellence, Integrity, Ownership, and Unity—and we seek individuals who embody these values in both their professional and personal lives. We are committed to our Impact: Valuing our clients, Growing our people, and Creating our future.

Together, we make the extraordinary happen.

Learn more about us at ncs.co and visit our LinkedIn career site.
