
Senior Data Engineer (dbt, Airflow)

MOL AccessPortal

Kuala Lumpur

On-site

MYR 120,000 - 160,000

Full time

Job summary

A technology company in Kuala Lumpur seeks an experienced Data Engineer to lead the design and development of scalable data pipelines. The successful candidate will have expertise in Python and SQL, hands-on experience with big data frameworks such as Apache Spark, and cloud data warehousing experience with AWS Redshift. The role involves collaborating with stakeholders to deliver reliable data infrastructure for AI/ML workloads. This opportunity offers growth in a dynamic environment.

Qualifications

  • 5+ years of experience in data engineering focusing on scalable data architectures.
  • Hands-on experience with AWS Redshift, Apache Airflow, and DBT.
  • Strong experience with big data frameworks: Apache Spark, Flink, Kafka.
  • Solid understanding of Linux, Docker, and Kubernetes.
  • Experience with at least one cloud platform.

Responsibilities

  • Lead the design and development of data pipelines for analytics and AI/ML workloads.
  • Build and maintain data architectures using tools like Redshift, Spark, and Kafka.
  • Implement and optimize data orchestration workflows.
  • Collaborate with data scientists and business stakeholders.
  • Support AI/ML initiatives with reliable data infrastructure.

Skills

Python
SQL
Data governance
Apache Spark
Apache Airflow
AWS Redshift
Linux
Docker
Kubernetes
AI/ML workflows

Education

Bachelor's degree in Computer Science, Data Engineering, Statistics, or related field

Tools

DBT (Data Build Tool)
Apache Flink
Apache Kafka

Job description

Lead the design and development of robust, scalable data pipelines for both traditional analytics and AI/ML workloads

Build and maintain data architectures including data warehouses, data lakes, and real-time streaming solutions using tools like Redshift, Spark, Flink, and Kafka

Implement and optimize data orchestration workflows using Airflow and data transformation processes using DBT
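
For context, a minimal sketch of the kind of Airflow-plus-dbt workflow this responsibility covers, assuming a daily schedule and hypothetical project paths, DAG name, and task names:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transformations",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run dbt models against the warehouse (e.g. Redshift) once upstream data has landed.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    # Test the freshly built models so bad data never reaches downstream consumers.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    dbt_run >> dbt_test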

Develop automated data workflows and integrate with DevOps/MLOps frameworks using Docker, Kubernetes, and cloud infrastructure

Implement best practices for data governance, including data quality, security, compliance, data lineage, and access control

Collaborate with data scientists, analysts, and business stakeholders to understand technical requirements and deliver reliable data infrastructure

Demonstrate strong business sensitivity to ensure data solutions align with business objectives and requirements

Support AI/ML initiatives by building feature stores, vector databases, and real-time inference pipelines

Continuously explore and adopt new technologies in data engineering and AI/ML space

Proactively drive new initiatives and mentor junior team members

Key Qualifications:

Bachelor's degree in Computer Science, Data Engineering, Statistics, or related field

5+ years of experience in data engineering with a focus on scalable data architectures

Expert proficiency in Python and SQL programming languages

Hands-on experience with AWS Redshift, Apache Airflow, and DBT (Data Build Tool)

Strong experience with big data frameworks: Apache Spark, Apache Flink, and Apache Kafka

Solid understanding of Linux, Docker, and Kubernetes for containerization and orchestration

Experience with at least one cloud platform (AWS preferred; GCP or Azure also acceptable)

Proven experience in dimensional modeling design and implementation

Strong business acumen with sensitivity to business requirements and ability to translate them into robust technical data solutions

Fluent in English (reading, writing, and verbal communication)

Experience in data governance including data quality, security, access management, and data lineage

Foundational knowledge of AI/ML workflows, model deployment pipelines, and LLM integration patterns

Demonstrated ability to lead technical initiatives and drive adoption of new technologies independently

Strong analytical and communication skills with experience working across cross-functional teams

Nice to Have:

Experience with OpenMetadata for data catalog and governance

Experience in gaming, e-commerce, or fintech industries
