
Data Engineer

Encora Technologies Sdn Bhd

Kuala Lumpur

On-site

MYR 100,000 - 150,000

Full time

Posted yesterday


Job summary

A leading tech company in Kuala Lumpur seeks an experienced Data Engineer to design and maintain scalable data pipelines. You will optimize data lake solutions using Snowflake or Databricks, collaborate with cross-functional teams, and ensure data quality and compliance standards. Candidates should have 7–8+ years of experience with strong expertise in SQL, data modeling, and scripting. This role offers the opportunity to work on innovative data solutions.

Skills

Data Engineering
Snowflake
Databricks
SQL
Python
Scala
CI/CD
Data Governance

Job description

Key Responsibilities
  • Design, build, and maintain scalable data pipelines and ETL/ELT workflows.
  • Develop and optimize data lake and data warehouse solutions using Snowflake and/or Databricks.
  • Implement data ingestion, transformation, and processing frameworks for structured and unstructured datasets.
  • Work closely with cross-functional teams (Data Analytics, Data Science, Product, Engineering) to support data consumption needs.
  • Build reusable and modular data pipeline components aligned with best practices.
  • Implement data quality validation, reconciliation, and monitoring controls.
  • Optimize performance for data storage, compute, and query execution.
  • Ensure adherence to data governance, security standards, and compliance requirements.
  • Participate in solution architecture discussions and contribute to technical design decisions.
Required Skills & Experience
  • 7–8+ years of hands-on experience as a Data Engineer.
  • Strong experience in Snowflake and/or Databricks (either is sufficient; both preferred).
  • Expertise in SQL and performance tuning for large datasets.
  • Proficiency with scripting languages such as Python or Scala.
  • Experience with data modeling techniques (star schema, normalized models, dimensional modeling).
  • Experience with CI/CD for data pipelines and version control (Git).
  • Exposure to data governance, metadata management, and data quality frameworks.