Big Data Engineer (Work from Home)

LHK Century

Kuala Lumpur

Remote

MYR 100,000 - 150,000

Full time

2 days ago

Job summary

A leading tech company in Kuala Lumpur is seeking a Big Data Engineer to design and optimize big data systems. The ideal candidate has strong skills in Hadoop ecosystem tools and big data modeling and is fluent in Mandarin. This is a great opportunity to work in a dynamic environment focused on technology. Join us to support seamless operations and impact millions of users.

Qualifications

  • Proficient in building and optimizing big data platforms using Hadoop ecosystem tools.
  • Strong grasp of big data modeling methodologies and techniques.
  • Fluent in Mandarin to liaise with Mandarin-speaking associates.

Responsibilities

  • Design and develop core components of the big data platform.
  • Participate in data middleware construction and component upgrades.
  • Research big data technologies to optimize architecture.

Skills

Big data platform principles
Java/Scala programming
Data modeling methodologies
Hands-on experience with ELK

Tools

Hadoop ecosystem tools
ClickHouse
Spark
Flume
Kafka
HBase
Hive
ZooKeeper

Job description

Overview

Are you a dedicated Big Data Engineer with a passion for maintaining and optimizing IT systems? Do you thrive in a fast-paced, dynamic environment where your expertise ensures seamless operations and supports millions of users? If so, we want YOU on our team!

If this role aligns with your interests, promptly click the "Quick apply" button! Missing out on such a remarkable and stable company would be regrettable, and not trying would be a lost opportunity. Take the chance and join our team!

Responsibilities
  1. Design, develop, and tackle technical challenges for core components of the big data platform;
  2. Participate in data middleware construction, including the development and upgrades of components such as data integration, metadata management, and task management;
  3. Research big data technologies (e.g., ELK, Flink, Spark, ClickHouse) to optimize cluster architecture, troubleshoot issues, and resolve performance bottlenecks.

Requirements
  1. Proficient in big data platform principles, with expertise in building and optimizing platforms using Hadoop ecosystem tools (e.g., Spark, Impala, Flume, Kafka, HBase, Hive, ZooKeeper) and ClickHouse;
  2. Hands-on experience in developing and deploying ELK-based big data analytics platforms;
  3. Strong grasp of big data modeling methodologies and techniques;
  4. Skilled in Java/Scala programming, design patterns, and big data technologies (Spark, Flink);
  5. Experience in large-scale data warehouse architecture/model/ETL design, with capabilities in massive data processing and performance tuning;
  6. Extensive database design/development experience, familiar with relational databases and NoSQL.
  7. Fluent in Mandarin in order to liaise with Mandarin-speaking associates.

Your application will include the following questions:

  • How would you rate your Mandarin language skills?
  • Which of the following statements best describes your right to work in Malaysia?
  • What's your expected monthly basic salary?
  • Which of the following types of qualifications do you have?
  • How many years' experience do you have as a Big Data Engineer?
  • Which of the following data analytics tools are you experienced with?
  • Which of the following programming languages are you experienced in?
  • Which of the following languages are you fluent in?
