Cloudera / Data Engineer

NCS Pte Ltd

Singapore

On-site

SGD 80,000 - 120,000

Full time

13 days ago

Job summary

NCS Pte Ltd is seeking a Cloudera / Data Engineer to join its data team. The role involves designing, building, and maintaining scalable data pipelines, with a focus on the Cloudera Hadoop ecosystem. Candidates should have significant data engineering experience and strong programming skills, particularly in Python and Java.

Qualifications

  • 10+ years of experience in data engineering, preferably with Cloudera.
  • Programming skills in Python, Java, and Spark.
  • Familiarity with Linux/Unix and shell scripting.

Responsibilities

  • Design, build, and manage the Cloudera Hadoop distribution.
  • Develop and maintain ETL pipelines.
  • Collaborate with the DevOps and Data Science teams.

Skills

Python
Java
Spark
Apache NiFi
ETL

Education

Bachelor's degree in Computer Science, Information Technology, or a related field

Tools

Cloudera
Hadoop
Kafka
Hive
Spark

Job description

Company Description

NCS is a leading technology services firm that operates across the Asia Pacific region in over 20 cities, providing consulting, digital services, technology solutions, and more. We believe in harnessing the power of technology to achieve extraordinary things, creating lasting value and impact for our communities, partners, and people. Our diverse workforce of 13,000 has delivered large-scale, mission-critical, and multi-platform projects for governments and enterprises in Singapore and the APAC region.

Job Description

We are seeking a skilled Cloudera / Data Engineer to join our growing data apps team. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and platforms, with a strong focus on the Cloudera Hadoop ecosystem. You will work closely with data analysts, scientists, and business stakeholders to ensure data accessibility, quality, and security.

Key Responsibilities

• Design, build, and manage the Cloudera Hadoop Distribution (CDH/CDP).

• Develop and maintain ETL pipelines using tools such as Apache NiFi, Hive, Spark, and Impala.

• Manage and optimize HDFS, YARN, Kafka, HBase, and Oozie workflows.

• Monitor and troubleshoot cluster performance and jobs, applying strong problem-solving and debugging skills.

• Collaborate with DevOps and Data Science teams to integrate data platforms into applications and analytics workflows.

• Ensure data governance, security, and compliance using tools like Apache Ranger, Atlas, and Kerberos.

• Mentor and guide a team of data engineers to deliver robust data solutions.

Qualifications

• Bachelor's degree in Computer Science, Information Technology, or a related field.

• 10+ years of experience in big data engineering, preferably with Cloudera.

• Strong programming skills in Python, Java, and Spark.

• Experience with Apache Spark, Hive, Impala, and Kafka.

• Familiarity with Linux/Unix and shell scripting.

Preferred Skills

• Cloudera Certified Professional (CCP) or Cloudera Data Platform certification.

• Experience with or knowledge of cloud platforms (AWS, Azure, or GCP) and hybrid deployments.

• Familiarity with CI/CD pipelines, Docker, or Kubernetes in a data context.
