Data Engineer

TIME dotCom Berhad

Shah Alam

On-site

MYR 70,000 - 110,000

Full time

30 days ago

Job summary

A leading company in Malaysia seeks a skilled Data Engineer to join its dynamic team. In this role, you'll design, build, and maintain robust data pipelines on Google Cloud Platform, collaborating with analytics and business teams to drive data-driven decisions while ensuring performance and scalability.

Qualifications

  • Strong experience with Python for data engineering tasks.
  • Hands-on experience with DBT, Airflow, and BigQuery.
  • Proficiency in deploying infrastructure using Terraform.

Responsibilities

  • Build and maintain ELT pipelines using Python, Airflow, and DBT.
  • Optimize existing pipelines for efficiency and scalability.
  • Ensure reliable data workflows with CI/CD pipelines.

Skills

Python
DBT
Airflow
BigQuery
Terraform
CI/CD

Job description

This position reports to the Chief Data Officer.

About the Role:

We are looking for a skilled Data Engineer to join our growing data team. You’ll be responsible for designing, building, and maintaining robust, scalable, and secure data pipelines on Google Cloud Platform (GCP). You’ll work closely with analytics, product, and business teams to enable data-driven decisions at scale.

We have three vacancies: one permanent role and two one-year contract positions, renewable annually.

Key Responsibilities:

  • Build and maintain ELT pipelines using Python, Airflow, and DBT
  • Model data using best practices for performance and maintainability in BigQuery
  • Implement infrastructure as code using Terraform
  • Ensure reliable data workflows with CI/CD pipelines and automated testing
  • Collaborate with data analysts, analytics engineers, and business stakeholders to meet data needs
  • Optimize existing pipelines for efficiency and scalability

Requirements:

  • Strong experience with Python for data engineering tasks
  • Hands-on experience with DBT, Airflow, and BigQuery
  • Proficiency in deploying infrastructure using Terraform
  • Familiarity with CI/CD tools and best practices in DevOps
  • Experience in cloud environments, preferably Google Cloud
  • Strong understanding of data warehousing concepts and performance tuning

*Only shortlisted candidates will be notified.
