Data Engineer

Michael Page

Kuala Lumpur

On-site

MYR 80,000 - 100,000

Full time

Yesterday

Job summary

A leading recruitment firm is looking for a Data Engineer in Kuala Lumpur. The ideal candidate will design and optimize data pipelines, build robust data solutions using tools like Databricks and Azure, and possess strong programming skills in Python or Java. With a focus on collaboration, this role will allow you to work on high-impact data projects and participate in the technological transformation of a global innovation-driven company. Join us to make a meaningful impact in the tech landscape.

Benefits

Flexible work environment
Career development opportunities
Innovative and inclusive company culture

Qualifications

  • Minimum 4 years of experience in data engineering or related fields.
  • Strong programming skills in Python, Scala, or Java.
  • Hands-on experience with big data tools (Databricks, ADF, Spark, PySpark).

Responsibilities

  • Design and optimize scalable data pipelines for analytics.
  • Collaborate with Data Science, Analytics, and engineering teams.
  • Implement data lineage, governance, and quality standards.

Skills

Data pipeline optimization
Python programming
Big data tools (Databricks, Spark)
Cloud platforms (Azure, AWS, GCP)
Collaboration with cross-functional teams
Microservices architecture

Tools

Databricks
Azure Data Factory
MLFlow
Docker
Kubernetes

Job description

As a Data Engineer, you will design, build, and optimize scalable data pipelines that power real‑time analytics, machine learning, and enterprise decision making.

In this role, you will:

  • Build robust pipelines using Databricks, Spark, PySpark, Azure Data Factory, Python & SQL.
  • Work with diverse datasets from APIs, applications, cloud data warehouses (Snowflake, Synapse, Redshift, BigQuery) and more.
  • Apply modern data architecture patterns like microservices, event‑driven design, and data lake frameworks.
  • Collaborate with Data Science, Analytics, and cross‑functional engineering teams.
  • Implement data lineage, governance, observability, and quality standards.
  • Support MLOps initiatives with tools such as MLFlow, Docker, Kubernetes, and Azure Functions.
  • Drive innovation through POCs and exploration of new technologies.
  • Develop infrastructure using IaC and CI/CD pipelines.

You will join an environment that values innovative, modern engineering practices and be part of a fast-evolving data transformation journey.

A successful Data Engineer will bring:

  • Minimum 4 years of experience in data engineering or related fields.
  • Strong programming skills in Python, Scala, or Java, with experience in Spark frameworks.
  • Hands‑on experience with big data tools (Databricks, ADF, Spark, PySpark).
  • Knowledge of cloud platforms such as Azure, AWS, or GCP.
  • Experience with relational or NoSQL databases and complex query building.
  • Familiarity with distributed systems, cloud data warehouses, and modern data ecosystems.
  • Ability to work with cross‑functional teams and communicate technical concepts clearly.
  • Curiosity for exploring new technologies and building scalable data solutions.

Our client is a global, innovation‑driven technology company known for building intelligent digital solutions used worldwide. With a strong focus on transformation, data modernization, and future‑ready platforms, they empower their teams to experiment with new tools, influence enterprise strategy, and create meaningful business impact.

Expect an environment that values learning, collaboration, and modern engineering practices, without the bureaucracy:

  • Work on high‑impact, enterprise‑scale data projects with global stakeholders.
  • Exposure to modern cloud, big data, and ML/Ops technologies.
  • A culture that encourages innovation, experimentation, and continuous learning.
  • Flexible work environment with international collaboration.
  • Career development in a global tech landscape.
  • Supportive, inclusive team culture with engagement activities.

If you're excited about building modern data pipelines, shaping scalable platforms, and growing your career in a global tech environment, we'd love to hear from you. Apply now and be part of something impactful!

