Senior Big Data Platform Engineer

TEAMLEASE DIGITAL CONSULTING PTE. LTD.

Singapore

On-site

SGD 100,000 - 140,000

Full time

3 days ago

Job summary

A leading company in digital consulting is seeking a highly skilled Senior Big Data Platform Engineer. The ideal candidate will have at least 9 years of specialised experience in managing Big Data ecosystems, particularly within the banking sector. Key responsibilities include maintaining Big Data platforms, automating configurations, and implementing security protocols. If you possess strong skills in tools such as Hadoop and Kafka, along with expertise in cloud platforms, we invite you to apply.

Qualifications

  • 9+ years of experience in Big Data technologies.
  • Proven experience in Hadoop and Kafka administration.
  • Strong knowledge of cloud platforms and security protocols.

Responsibilities

  • Maintain and administer Big Data platforms across multiple environments.
  • Automate deployment and configuration using Ansible.
  • Implement security protocols including Kerberos and TLS/SSL.

Skills

Big Data technologies
Ansible
Security integration
Communication

Education

Bachelor's degree in Science, Engineering, or Technology

Tools

Hadoop
Kafka
AWS
Azure
ELK
Grafana

Job description

About the Role:

We are seeking a highly skilled Senior Big Data Platform Engineer with strong IT experience, including 9+ years of specialisation in Big Data technologies and platform engineering. The ideal candidate will possess in-depth expertise in managing and administering large-scale Big Data ecosystems, particularly within the banking and financial sectors.

Key Responsibilities:

  • Maintain and administer Big Data platforms (Hadoop, Confluent Kafka, Spark, Hive, HBase, Impala, Druid, etc.) across multiple environments (DEV, UAT, QA, PROD).
  • Automate deployment and configuration using Ansible and other scripting tools.
  • Provide platform support, performance tuning, and capacity planning for Kafka and Hadoop clusters (a minimal health-check sketch follows this list).
  • Implement security protocols including Kerberos, TLS/SSL, and Sentry.
  • Work closely with engineering and infrastructure teams to design scalable, secure Big Data solutions.
  • Manage OS patching, cluster upgrades, and disaster recovery procedures.
  • Supervise scheduled workloads (e.g., via Airflow or Control-M).
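
To illustrate the day-to-day Kafka platform support described above, here is a minimal sketch in Python using the confluent-kafka AdminClient; the broker address is a placeholder assumption, not part of this posting:

    # Minimal sketch (assumes confluent-kafka is installed: pip install confluent-kafka).
    # Summarises cluster metadata and flags under-replicated partitions, a routine
    # health signal when supporting Kafka clusters.
    from confluent_kafka.admin import AdminClient

    admin = AdminClient({"bootstrap.servers": "broker1.example.internal:9092"})  # hypothetical broker
    md = admin.list_topics(timeout=10)  # fetch cluster-wide metadata

    print(f"{len(md.brokers)} brokers, {len(md.topics)} topics")
    for name, topic in md.topics.items():
        for pid, part in topic.partitions.items():
            # A partition whose in-sync replica set is smaller than its replica
            # assignment is under-replicated and worth investigating.
            if len(part.isrs) < len(part.replicas):
                print(f"under-replicated: {name}[{pid}] isr={part.isrs} replicas={part.replicas}")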

Requirements:

  • Bachelor's degree in Science, Engineering, or Technology

Proven experience with:

  • Hadoop ecosystem (HDFS, YARN, Hive, HBase, Spark, Sqoop, Oozie, etc.)
  • Confluent Kafka and Kafka cluster administration
  • Security integration (Kerberos, Sentry, TLS/SSL); a configuration sketch follows this list
  • Cloud platforms: AWS, Azure
  • Monitoring tools: ELK, Grafana

Also required:

  • Strong knowledge of Ansible and scripting (Shell)
  • Excellent communication, troubleshooting, and team collaboration skills
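
As a rough illustration of the security integration listed above, a Kerberos- and TLS-secured Kafka client configuration might look like the following Python sketch; every hostname, principal, and file path here is hypothetical:

    # Hedged sketch: Kafka producer authenticated with Kerberos (GSSAPI) over TLS
    # (SASL_SSL), using standard librdkafka configuration keys via confluent-kafka.
    # All values are placeholders, not taken from this posting.
    from confluent_kafka import Producer

    producer = Producer({
        "bootstrap.servers": "broker1.example.internal:9093",            # TLS listener (assumed)
        "security.protocol": "SASL_SSL",                                 # Kerberos auth over TLS
        "sasl.mechanism": "GSSAPI",                                      # Kerberos
        "sasl.kerberos.service.name": "kafka",                           # broker service principal name
        "sasl.kerberos.principal": "svc-app@EXAMPLE.COM",                # hypothetical client principal
        "sasl.kerberos.keytab": "/etc/security/keytabs/svc-app.keytab",  # hypothetical keytab path
        "ssl.ca.location": "/etc/pki/tls/certs/cluster-ca.pem",          # hypothetical cluster CA
    })

    producer.produce("platform.healthcheck", value=b"ping")  # hypothetical topic
    producer.flush(10)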

Preferred Qualifications:

  • Hands-on experience working in the banking domain
  • Familiarity with Druid, Alluxio, Cloudera Director, and CDSW
  • Experience in hybrid cloud deployments and on-premises cluster setups
  • Familiarity with DevOps principles for Big Data environments