Data Engineer
Pos Malaysia · Kuala Lumpur · On-site
MYR 80,000 - 120,000 · Full time

Job summary

A leading logistics company in Kuala Lumpur seeks a motivated Data Engineer to architect and implement scalable data solutions using AWS services. The role centres on developing ETL/ELT pipelines, applying data governance, and collaborating with cross-functional teams. Candidates should have 2-4 years of data engineering experience, strong SQL and Python skills, and hands-on experience with AWS tools. This is an opportunity to join a company in the midst of transformation.

Qualifications

  • 2-4 years of experience in data engineering, with at least 1 year in cloud environments.
  • Proficiency in AWS tools such as S3, Glue, Redshift, and Lambda.
  • Strong SQL and Python skills for data transformation.

Responsibilities

  • Architect and implement scalable data solutions using AWS services.
  • Develop and maintain ETL/ELT pipelines.
  • Collaborate with analysts and business stakeholders to understand data needs.

Skills

Data modeling
SQL
Python
Data transformation
Collaboration
Problem-solving

Experience

2-4 years of data engineering experience
Cloud-based experience (preferably AWS)

Tools

AWS S3
AWS Glue
AWS Redshift
AWS Lambda
AWS Step Functions
Power BI
Snowflake
Databricks

Job description

At Pos Malaysia, we're passionate about building trust to connect lives and businesses for a better tomorrow. As we transform this incredible 200-year-old business, we're seeking a highly motivated, engaged, and driven Data Engineer to join our team. If you're excited by transformation and the significant opportunity it represents, we encourage you to apply.

Own It
  • Architect and implement scalable data solutions using AWS services like S3, Glue, Redshift, Lambda, and Step Functions.
  • Develop and maintain robust ETL/ELT pipelines that transform raw data into clean, structured, and production-ready datasets (see the sketch after this list).
  • Ensure the reliability and high performance of all data solutions from design to operation.
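
As a rough, hypothetical illustration of the pipeline work described in this list (not code from Pos Malaysia): a minimal AWS Glue PySpark job that reads raw JSON from S3, applies basic cleaning, and writes curated Parquet back to S3. The bucket names, dataset, and column names are assumptions made for the sketch.

    import sys

    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    # Standard Glue bootstrap: JOB_NAME is supplied by the Glue runtime.
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Raw zone (hypothetical bucket): landing area for JSON order events.
    raw = spark.read.json("s3://example-raw-zone/orders/")

    # Basic cleaning: deduplicate and drop rows missing key fields.
    clean = (
        raw.dropDuplicates(["order_id"])
           .filter("order_id IS NOT NULL AND order_ts IS NOT NULL")
    )

    # Curated zone (hypothetical bucket): partitioned Parquet ready for
    # Redshift or other consumers; order_date is assumed to exist in
    # the source events.
    (clean.write
          .mode("overwrite")
          .partitionBy("order_date")
          .parquet("s3://example-curated-zone/orders/"))

    job.commit()

In a real deployment, a job like this would be scheduled and chained with others via AWS Step Functions, which the role also calls out.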
Build Trust
  • Establish and enforce strong data governance, security, and quality controls across all pipelines and datasets.
  • Write clean, well-documented code and champion best practices in testing, automation, and version control (a minimal testing example follows this list).
  • Engage consistently with internal counterparts to resolve issues and build strong working relationships.
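
As a minimal sketch of what the testing practice above can look like for pipeline output, here is a pytest suite over a curated orders dataset; the fixture path and the order_id, amount_myr, and order_ts column names are hypothetical.

    import pandas as pd
    import pytest

    @pytest.fixture
    def orders() -> pd.DataFrame:
        # Hypothetical sample of curated output; in CI this could be
        # pulled from the warehouse (e.g., Redshift) or from S3 instead.
        return pd.read_parquet("tests/fixtures/orders_sample.parquet")

    def test_primary_key_is_unique(orders):
        # Duplicate order_ids would double-count downstream metrics.
        assert not orders["order_id"].duplicated().any()

    def test_amounts_are_non_negative(orders):
        assert (orders["amount_myr"] >= 0).all()

    def test_no_future_timestamps(orders):
        # Assumes naive timestamps; a tz-aware column would compare
        # against pd.Timestamp.now(tz="UTC") instead.
        assert (pd.to_datetime(orders["order_ts"]) <= pd.Timestamp.now()).all()

Run on every commit, checks like these catch pipeline regressions before they reach stakeholders.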
One Team
  • Collaborate closely with analysts, business stakeholders, and IT teams to understand data needs and deliver user-friendly solutions.
  • Contribute to a knowledge-sharing culture and help elevate the team's technical standards.
  • Develop positive working relationships with customers, partners, and other department staff.
Move Fast
  • Continuously refine and optimize data models (e.g., star schemas; see the sketch after this list) to boost reporting efficiency.
  • Proactively identify and resolve performance bottlenecks and inefficiencies in data workflows.
  • Adapt, simplify, and act quickly based on dynamic business needs and market changes.
  • Recommend and experiment with new tools or AWS services that make the engineering stack more efficient, robust, and future-proof.
  • Automate manual processes and champion continuous improvement in data engineering practices, freeing time for strategic work.
  • Promote a culture of continuous learning and innovation within the team.
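
As a sketch of the star-schema refinement mentioned in the first bullet of this list: splitting a flat delivery extract into one fact table and two dimension tables with pandas. The dataset and all column names are invented for illustration.

    import pandas as pd

    # Hypothetical flat extract: one row per parcel delivery.
    flat = pd.read_parquet("deliveries_flat.parquet")

    def build_dimension(df: pd.DataFrame, cols: list[str], key: str) -> pd.DataFrame:
        # One row per distinct entity, with an integer surrogate key.
        return (df[cols].drop_duplicates()
                        .reset_index(drop=True)
                        .rename_axis(key)
                        .reset_index())

    dim_branch = build_dimension(flat, ["branch_code", "branch_name", "state"], "branch_key")
    dim_service = build_dimension(flat, ["service_code", "service_name"], "service_key")

    # Fact table: measures plus foreign keys into the dimensions.
    fact_delivery = (
        flat.merge(dim_branch, on=["branch_code", "branch_name", "state"])
            .merge(dim_service, on=["service_code", "service_name"])
            [["delivery_date", "branch_key", "service_key", "weight_kg", "revenue_myr"]]
    )

Narrow dimension tables and a keyed fact table are what let BI tools such as Power BI aggregate large volumes efficiently.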
Delight the Customers
  • Build reliable datasets and pipelines that empower stakeholders to self-serve insights.
  • Translate business questions into effective data solutions that drive better decision-making and improved customer outcomes.
Requirements
  • 2-4 years of experience in data engineering, with at least 1 year in cloud-based environments (preferably AWS).
  • Experience building and managing data models and pipelines for large-scale data environments.
  • Proficiency in AWS tools: S3, Glue, Redshift, Lambda, Step Functions.
  • Strong SQL and Python skills for data transformation and automation.
  • Hands-on experience with various data sources, including structured and unstructured data.
  • Solid grasp of data warehousing concepts, data modeling, and modern ELT/ETL practices.
  • Strong problem-solving skills, with the ability to work independently while fostering teamwork and collaboration across business units and cross-functional teams.
  • Proven ability to thrive in a fast-paced, dynamic, and transformative environment, embracing change and driving progress.
  • Strategic thinker with strong analytical skills.
  • Excellent communication and negotiation skills, capable of engaging effectively with diverse stakeholders.
Preferred requirements
  • Experience building data warehouses for a large enterprise.
  • Experience with Power BI, Snowflake on AWS, or Databricks.
  • Familiarity with DevOps practices such as CI/CD.
  • Understanding of AWS‑based data security and governance best practices.

This is a super exciting time to be joining Pos Malaysia. Your contribution will help us to write the next chapter in our history.
