Data Engineer (AWS/Azure)

Kuala Lumpur

On-site

MYR 80,000 - 120,000

Full time


Job summary

A technology consulting firm in Kuala Lumpur is seeking a Data Engineer to join a digital transformation program. The role focuses on developing scalable data pipelines using AWS and Azure, integrating master data to support analytics. Ideal candidates will have 3-7 years of experience, strong skills in Python and SQL, and familiarity with modern orchestration tools like Airflow. This position offers a competitive salary and an engaging opportunity to work in a collaborative environment.

Job description
Data Engineer (AWS/Azure)

Thakral One Pte Ltd – Kuala Lumpur, Malaysia

Posted 11 hours ago – Permanent – Competitive

The Opportunity

We are seeking Data Engineers to join a major digital transformation program led by a global consulting client. This role will focus on building scalable, efficient, and production‑grade data pipelines within the client’s AWS and Azure data environments, integrating master data across complex systems to support analytics, AI/ML use cases, and operational decision‑making.

The ideal candidate is detail‑oriented, collaborative, and able to design robust ETL/ELT workflows in a high‑impact enterprise environment.

The Role
  • Design, develop, and maintain data pipelines using Airflow and modern orchestration tools.
  • Build modular, incremental data flows to support efficient refresh cycles and minimize redundant processing.
  • Collaborate with cross‑functional teams to integrate AI/ML models into production‑grade environments.
  • Develop ETL/ELT workflows to ingest and transform data from multiple telecom and operational systems.
  • Implement data quality, version control, and governance frameworks with reusable, documented components.
  • Conduct troubleshooting, optimization, and performance tuning of data processes to ensure reliability and scalability.
  • Partner with business and analytics teams to validate domain logic and ensure accurate data interpretation.

The Expertise
Education & Certifications
  • Bachelor's degree in Computer Science, Information Technology, or Data Engineering.
  • Certifications in Azure Data Engineering, AWS Data Analytics, or GCP Data Engineering are an advantage.

Experience & Background
  • 3-7 years of hands‑on experience in data engineering or related technical roles.
  • Strong proficiency in Python and SQL, with hands‑on experience in Airflow and Spark.
  • Working knowledge of Azure data ecosystem (Data Factory, AzureML, Synapse, Blob Storage, Azure SQL).
  • Familiarity with AWS data tools (S3, ECS, RDS, EC2).
  • Experience designing incremental, modular data pipelines and integrating APIs/external data sources.
  • Understanding of data governance, version control, and access management practices.
  • Excellent analytical, problem‑solving, and communication skills.

Must‑Have Technical Skills
  • Data pipeline development (Airflow, Azure Data Factory, Spark)
  • SQL and Python programming
  • AWS and/or Azure data platforms
  • ETL/ELT design, data modeling, data governance
  • API integrations and modular pipeline design

Preferred Skills
  • Knowledge of containerization (Docker, Kubernetes) for deployment of data solutions.
  • Familiarity with CI/CD integration for data workflows.
  • Exposure to telecom or large‑scale enterprise environments.

Other Relevant Information
  • Location: Kuala Lumpur, Malaysia
  • Engagement Type: Contract, 12 months
  • Schedule: Standard hours, full onsite
  • Client Industry: Telecommunications
  • Start Date: December 2025 (2‑week onboarding)

About us

Thakral One is a consulting and technology services company headquartered in Singapore, with a pan‑Asian presence. We focus primarily on technology‑driven consulting, adoption of value‑added bespoke solutions, enabling enhanced decision support through data analytics, and embracing possibilities in the cloud. We are heavily inclined towards building capabilities collaboratively with clients and believe strongly in delivering grounded, practical outcomes. This approach is made possible through our partnerships with leading global technology providers and our internal R&D teams. Our clients come from Financial Services, Banking, Telco, Government, Healthcare, and Consumer‑oriented organisations.
