Job Search and Career Advice Platform

Data Engineer

Encora Technologies

Kuala Lumpur

On-site

MYR 100,000 - 150,000

Full time

Yesterday

Job summary

A leading technology firm in Kuala Lumpur is seeking a Senior Data Engineer with strong experience in cloud-based architectures and modern data platforms. The ideal candidate will have 7–8+ years of hands-on experience with Snowflake or Databricks, and proficiency in SQL. Responsibilities include designing scalable data pipelines, optimizing data lake solutions, and collaborating with cross-functional teams. This role is critical for ensuring data quality and governance, making it a strong opportunity for experienced professionals.

Skills

Data pipeline design
Snowflake
Databricks
SQL performance tuning
Python
Data modeling techniques
CI/CD
Version control

Job description

Overview

We are looking for a Senior Data Engineer with strong experience in modern data platforms and cloud-based architectures. The ideal candidate has deep hands-on experience with Snowflake or Databricks, has built scalable data pipelines, and is proficient in Azure cloud services.

Key Responsibilities
  • Design, build, and maintain scalable data pipelines and ETL/ELT workflows.
  • Develop and optimize data lake and data warehouse solutions using Snowflake and/or Databricks.
  • Implement data ingestion, transformation, and processing frameworks for structured and unstructured datasets.
  • Work closely with cross-functional teams (Data Analytics, Data Science, Product, Engineering) to support data consumption needs.
  • Build reusable and modular data pipeline components aligned with best practices.
  • Implement data quality validation, reconciliation, and monitoring controls.
  • Optimize performance for data storage, compute, and query execution.
  • Ensure adherence to data governance, security standards, and compliance requirements.
  • Participate in solution architecture discussions and contribute to technical design decisions.
Required Skills & Experience
  • 7–8+ years of hands-on experience as a Data Engineer.
  • Strong experience in Snowflake and/or Databricks (either is acceptable; both preferred).
  • Expertise in SQL and performance tuning for large datasets.
  • Proficiency with scripting languages such as Python or Scala.
  • Experience with data modeling techniques (star schema, normalized models, dimensional modeling).
  • Experience with CI/CD for data pipelines and version control (Git).
  • Exposure to data governance, metadata management, and data quality frameworks.