
Data Platform Engineer - Build Scalable Data Pipelines

UST Global

Selangor

On-site

MYR 80,000 - 100,000

Full time

Today


Job summary

A leading global technology service provider is seeking a Data Engineer to design and maintain scalable data systems that support analytics and business intelligence efforts. The ideal candidate has strong proficiency in Python and SQL and at least 5 years of relevant experience. Responsibilities include building data pipelines and architectures, ensuring data quality, and collaborating with stakeholders to drive data-driven decision-making. The role requires familiarity with cloud platforms and data pipeline tools, along with solid analytical and communication skills.

Qualifications

  • Strong proficiency in Python and SQL.
  • Hands-on experience with data pipeline tools (e.g., Airflow, DBT, Spark, Kafka).
  • Experience working with data lakes, data warehouses, and big data technologies.
  • Familiarity with cloud platforms (AWS, Azure, or Google Cloud).
  • Understanding of Agile/Scrum methodologies.
  • Strong analytical and problem-solving skills.
  • Good communication and collaboration abilities.

Responsibilities

  • Design, develop, and maintain scalable data models, ETL/ELT pipelines, and data workflows.
  • Build and optimize data architectures including data lakes, data warehouses, and big data platforms.
  • Develop and maintain data-centric applications and services.
  • Integrate structured and unstructured data from multiple sources.
  • Ensure data quality, integrity, governance, and security standards are maintained.
  • Support and implement cloud-based data solutions (AWS, Azure, or GCP).
  • Collaborate with data analysts, BI developers, and business stakeholders.
  • Monitor and troubleshoot data pipeline performance and system issues.
  • Participate in Agile/Scrum ceremonies and contribute to continuous improvement initiatives.

Skills

Python
SQL
Data pipeline tools (Airflow, DBT, Spark, Kafka)
Data lakes and data warehouses
Cloud platforms (AWS, Azure, Google Cloud)
Agile/Scrum methodologies
Analytical skills
Problem-solving skills
Communication skills
Collaboration abilities

Education

Bachelor’s degree in Information Technology, Computer Science, Engineering, or Mathematics

Tools

Airflow
DBT
Spark
Kafka