
Hadoop Developer, Group IT

eTeam

Kuala Lumpur

On-site

MYR 100,000 - 140,000

Full time

3 days ago


Job summary

A data solutions company based in Kuala Lumpur is seeking an experienced professional to support the development and maintenance of Hadoop solutions. The role involves developing ETL/ELT jobs, providing production support, and ensuring data quality. The ideal candidate should have a Bachelor's degree in IT, at least 8 years of experience in data-related work, and strong skills in Hadoop ecosystems. Certifications in Hadoop development are preferred.

Qualifications

  • 8+ years of experience in data-related work in Warehouse/Data Marts.
  • 2+ years of experience with Hadoop/ETL.
  • Ability to plan and organize technical work.

Responsibilities

  • Develop ETL/ELT jobs from source systems to the data lake platform.
  • Ensure development standards are followed.
  • Provide operational support after production deployment.

Skills

  • Scripting skills in a Linux environment
  • SQL
  • Interpersonal skills
  • Problem-solving skills
  • Hadoop expertise

Education

Bachelor's degree in IT

Tools

  • Hadoop
  • Sqoop
  • Hive
  • HBase
  • Spark
  • Oozie
  • Python
  • Scala

Job description

Job Purpose

To support the development and maintenance of Hadoop solutions for the enterprise. To be part of initiatives that bring data into the data lake and deliver insights. To perform production support and maintenance of existing datasets in Hadoop.

The Job

  • Develop ETL/ELT jobs based on requirements, from source systems to the data lake platform.
  • Ensure that all development standards are being followed.
  • Perform code reviews of application programs developed by team members.
  • Provide production and operational support after production deployment.
  • Monitor and manage production jobs to verify execution and measure performance; assure ongoing data quality; optimize the system for scalability and performance; and identify improvement opportunities for key ETL processes.
  • Work effectively with all technical personnel and clearly translate business priorities and objectives into technical solutions.
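
The responsibilities above follow the classic extract-transform-load shape. As a rough illustration only, here is a minimal Python sketch of that pattern, with a data-quality check and a date-partitioned "data lake" layout. In the actual role these stages would be handled by tools like Sqoop, Hive, and Spark; all field names, schemas, and quality rules below are hypothetical examples, not part of the job requirements.

```python
# Minimal ETL sketch using only the standard library.
# extract -> transform (with a data-quality rule) -> load (date-partitioned),
# mirroring the source-to-data-lake flow described in the posting.
# All column names and paths here are hypothetical.
import csv
import io
from collections import defaultdict


def extract(source: str) -> list[dict]:
    """Extract: parse raw CSV rows exported from a source system."""
    return list(csv.DictReader(io.StringIO(source)))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize types and drop rows failing data-quality checks."""
    clean = []
    for row in rows:
        if not row.get("txn_id"):  # basic data-quality rule: reject missing keys
            continue
        clean.append({
            "txn_id": row["txn_id"].strip(),
            "amount": round(float(row["amount"]), 2),
            "txn_date": row["txn_date"],
        })
    return clean


def load(rows: list[dict]) -> dict[str, list[dict]]:
    """Load: group rows by date, mimicking a date-partitioned lake layout."""
    partitions: dict[str, list[dict]] = defaultdict(list)
    for row in rows:
        partitions[f"txn_date={row['txn_date']}"].append(row)
    return dict(partitions)


raw = """txn_id,amount,txn_date
A1,10.500,2024-01-01
,99.0,2024-01-01
A2,7.25,2024-01-02
"""
lake = load(transform(extract(raw)))
```

In a production Hadoop stack, the same shape appears as a Sqoop import feeding the raw zone, Spark or Hive transforms, and writes into Hive tables partitioned by date; monitoring the jobs end to end is what the operational-support bullets describe.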

Requirements

  • A Bachelor's degree in IT.
  • A minimum of 8 years' experience in data-related work in Warehouses/Data Marts, including at least 2 years of Hadoop/ETL experience.
  • Ability to plan and organize technical work and deliverables.
  • Self-motivated and independent. Able to work with minimum supervision and to work well with stakeholders and project staff.
  • Ability to prioritize and multi-task across numerous work streams.
  • Strong interpersonal skills; ability to work on cross-functional teams. Strong verbal and written communication skills.
  • Deep knowledge of best practices gained through relevant experience across data-related disciplines and technologies, particularly enterprise-wide data architectures and data warehousing/BI.
  • Demonstrated problem-solving skills. Ability to learn effectively and meet deadlines.
  • Strong scripting skills in Linux environment and SQL.
  • Expertise in the Hadoop ecosystem, including HDFS (Hortonworks).
  • Hands-on experience with Sqoop, Hive, HBase, Spark, Oozie, Python, and Scala is a must.

Licenses & Certifications

  • Hadoop Certification
  • Hadoop Developer