
Senior Data Engineer

Elitez Pte Ltd

Malaysia

On-site

MYR 100,000 - 150,000

Full time

Job summary

A company in Malaysia is seeking a Data Engineer to design and maintain data pipelines using GCP services such as BigQuery and Dataflow. Candidates should have over 8 years of experience in data engineering, with strong skills in SQL and Python. The role involves data modeling, governance, and collaboration with BI teams to support analytics. Ideal applicants will also have familiarity with orchestration workflows and exposure to Azure data tools.

Benefits

13th Month Bonus
Staff Insurance Provided

Qualifications

  • 8 years of hands-on experience in data engineering or related roles.
  • Understanding of relational, dimensional, and modern data modeling concepts.
  • Exposure to Azure Data Stack is a plus.

Responsibilities

  • Design, build, and maintain batch and streaming data pipelines using GCP services.
  • Implement and optimize data models in BigQuery for analytics and reporting.
  • Monitor and troubleshoot data pipelines for reliability and efficiency.
  • Work with analysts and BI developers to enable reporting and analytics.

Skills

Strong SQL
Python
GCP Data Stack
Orchestration Workflows (Airflow/Composer)

Tools

BigQuery
Dataflow
Dataproc
Dataform
Oracle
MicroStrategy

Job description

Key Responsibilities:

Pipeline Development: Design, build, and maintain batch and streaming data pipelines using GCP services such as BigQuery, Dataflow, Dataproc, Composer, Dataform, and Cloud Functions.

Data Modeling & Optimization: Implement and optimize data models in BigQuery to support analytics, BI reporting, and machine learning workloads.

Data Integration: Connect and transform data from multiple sources, including APIs, databases, event streams, and flat files.

Platform Reliability: Monitor and troubleshoot data pipelines, ensuring high availability, scalability, and cost efficiency.

Governance & Quality: Implement data validation, quality checks, and security best practices to ensure trusted data.

Collaboration: Work closely with analysts, BI developers (Tableau, MicroStrategy), and business teams to enable reporting and self-service analytics.

Legacy Support (Light): Provide occasional support for legacy systems (Oracle, MicroStrategy) where needed, focusing on data extraction and gradual modernization.

Key Requirements:

Experience: 8 years of hands‑on experience in data engineering, ETL/ELT development, or related roles.

  • Strong SQL and Python skills for data transformation and automation.
  • Hands‑on experience with GCP Data Stack (BigQuery, Dataflow, Composer, Dataproc, Dataform).
  • Familiarity with orchestration workflows (Airflow/Composer) and CI/CD for data pipelines.

Data Modeling: Understanding of relational, dimensional, and modern data modeling concepts, with an eye for performance optimization.

Cloud Knowledge: Exposure to Azure Data Stack (Synapse, Data Factory, Databricks) is a plus.

Your application will include the following questions:

  • Which of the following statements best describes your right to work in Malaysia?
  • What's your expected monthly basic salary?
  • Which of the following types of qualifications do you have?
  • How many years' experience do you have as a Data Engineer?
  • How many years' experience do you have using SQL queries?
  • Which of the following programming languages are you experienced in?
  • Which of the following data analytics tools are you experienced with?

