Data Engineer

ThunderSoft

Penang

On-site

MYR 100,000 - 150,000

Full time

16 days ago

Job summary

A leading company in the tech sector seeks an experienced Data Engineer to modernize data infrastructure and drive innovation in analytics, engineering, and AI/ML workloads. The successful candidate will design robust data pipelines, build cloud-based platforms, and ensure data reliability and governance. Ideal for those with strong programming skills in Python and SQL, along with hands-on experience in technologies like Snowflake and Databricks.

Benefits

Flexible working hours
Health insurance
Professional development opportunities

Qualifications

  • 3+ years in data engineering with large-scale systems.
  • Deep expertise in Python and SQL required.
  • Experience with cloud-based platforms like Snowflake and Databricks.

Responsibilities

  • Design and maintain scalable data pipelines using Python and SQL.
  • Build and optimize cloud-based data platforms with Snowflake/Databricks.
  • Collaborate with teams to turn requirements into data solutions.

Skills

Data Engineering
Python
SQL
ETL
API Development
Data Modeling

Education

Bachelor’s or Master’s in Computer Science, Information Systems, or Engineering

Tools

Snowflake
Databricks
Informatica
Talend
Azure
AWS
GCP

Job description

Summary

This role is ideal for someone with strong expertise in Snowflake and/or Databricks, advanced Python and SQL skills, and experience building scalable data pipelines and high-performance, low-latency APIs. The role exists to modernize our data infrastructure and drive innovation across analytics, engineering, and AI/ML workloads.

Key Responsibilities

  • Design, build, and maintain robust and scalable data pipelines using Python, SQL, and ETL tools like Informatica, ADF, SSIS, or Talend.
  • Architect and operationalize cloud-based data platforms leveraging Snowflake and/or Databricks for storage, transformation, and analytics.
  • Build high-performance, low-latency APIs to enable real-time data access and power scalable applications.
  • Collaborate with cross-functional teams to gather business requirements and translate them into efficient, production-grade data engineering solutions.
  • Design and implement data models, including dimensional modeling, to support reporting and advanced analytics use cases.
  • Optimize SQL performance and automate data workflows through CI/CD pipelines and orchestration frameworks.
  • Integrate structured, semi-structured, and unstructured data from diverse sources including APIs and large transactional systems.
  • Ensure data reliability, quality, and governance using tools like Alation or Talend DQ.
  • Support data science teams with engineered datasets to power AI/ML/LLM models and business insights.

Required Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or related field.
  • 3+ years of experience in data engineering, working with large-scale data systems and complex transformation logic.
  • Strong programming skills in Python and deep expertise in SQL.
  • Proven experience with Snowflake and/or Databricks, including performance tuning, workload optimization, and cloud-native architecture.
  • Hands-on experience with Informatica or other modern ETL tools (ADF, Talend, SSIS).
  • Experience building and managing RESTful APIs to support real-time applications.
  • Solid understanding of data modeling, data warehousing, and data architecture best practices.
  • Familiarity with data governance, quality validation, and metadata cataloging tools.
  • Comfortable working in agile teams and driving deliverables independently.

Preferred Qualifications

  • Experience with Snowflake Cortex for AI/ML and LLM integration is a strong plus.
  • Familiarity with Spark, Delta Lake, or similar big data frameworks.
  • Exposure to AI/LLM model pipelines, prompt engineering, or chatbot data ingestion.
  • Certifications in Snowflake, Databricks, or cloud platforms (Azure, AWS, GCP).
  • Experience in semiconductor or high-tech industries is a bonus.