DevOps Engineer (Big Data / Hadoop) – Contract

NTT SINGAPORE PTE. LTD.

Singapore

On-site

SGD 80,000 - 100,000

Full time


Job summary

A leading financial services group in Singapore is seeking a skilled DevOps Engineer to manage deployments and keep large-scale systems running smoothly. The role requires 4–7 years of experience, strong Linux and scripting skills, and familiarity with CI/CD tools and big data technologies. The position is a 12-month contract with the possibility of renewal.

Qualifications

  • 4–7 years of DevOps/system engineering experience.
  • Strong hands-on experience with Linux administration (RHEL preferred).
  • Experience with CI/CD tools is essential.

Responsibilities

  • Design, implement, and maintain CI/CD pipelines.
  • Troubleshoot complex deployment issues and performance bottlenecks.
  • Develop automation scripts to streamline tasks.
  • Administer Linux environments and databases.

Skills

Linux administration
Shell scripting (Bash)
Python
CI/CD tools
Big data technologies
Troubleshooting skills
Communication skills

Education

Bachelor’s degree in Computer Science or equivalent

Tools

Jenkins
GitLab CI/CD
Hadoop
Cloudera CDP
Dataiku DSS
AWS

Job description

DevOps Engineer (Big Data / Hadoop) – Contract

Employment Type: Contract (12 months, renewable)

Location: Central, Singapore

Experience Required: 4–7 years

Salary Range: Based on experience

Our client is a leading financial services and insurance group headquartered in Singapore with a strong regional presence across Asia.

We are looking for a skilled DevOps Engineer to support a financial services project. The engineer will be responsible for managing and automating deployments, troubleshooting issues across big data and CI/CD environments, and ensuring the smooth operation of large-scale systems.

Key Responsibilities:

  • Design, implement, and maintain CI/CD pipelines for software build, test, and deployment.
  • Troubleshoot and resolve complex deployment issues, performance bottlenecks, and system failures.
  • Develop and maintain automation scripts in Bash and Python to streamline system administration tasks.
  • Collaborate with data engineers, data scientists, and development teams to optimize workflows.
  • Manage and administer Linux environments (RHEL, Ubuntu) in both on-premise and data center setups.
  • Handle database administration for Hadoop (CDP), MySQL, and SQL databases.
  • Support security configurations including Kerberos, LDAP, TLS, Ranger, SSSD.
  • Deploy and administer Dataiku DSS and Project Deployer.
  • Document processes, configurations, and troubleshooting steps.
  • Provide technical guidance and production support.

Requirements:

  • Bachelor’s degree in Computer Science, IT, or equivalent experience.
  • 4–7 years of DevOps / system engineering experience.
  • Strong hands-on experience with Linux administration (RHEL preferred).
  • Proficiency in shell scripting (Bash) and Python.
  • Experience with CI/CD tools (Jenkins, GitLab CI/CD).
  • Knowledge of big data technologies (Hadoop, Cloudera CDP).
  • Familiarity with Dataiku deployment & administration is highly desirable.
  • Strong troubleshooting and problem-solving skills.
  • Good communication and stakeholder collaboration skills.
  • Exposure to cloud platforms (AWS) is a plus.

Interested candidates are kindly requested to email their CV, with details of their experience, to sandeep.sringeripai@global.ntt
