Lead Data Engineer
Pos Malaysia · Kuala Lumpur (On-site) · MYR 120,000 - 160,000 · Full time

Job summary

A leading logistics company in Kuala Lumpur is seeking a highly motivated Lead Data Engineer. In this role, you'll architect and implement scalable data solutions using AWS and develop robust ETL/ELT pipelines. You will collaborate closely with stakeholders to deliver user-friendly solutions and ensure the reliability and performance of data systems. The ideal candidate will have 8-10 years of experience in data engineering and proficiency in SQL and Python. Join us to make a significant impact during this transformational journey.

Qualifications

  • 8-10 years of experience in data engineering, with 5 years in cloud environments (AWS preferred).
  • Experience managing data models and pipelines in large-scale environments.
  • Proficiency in AWS tools such as S3, Glue, Redshift, Lambda, and Step Functions.
  • Strong SQL and Python skills.

Responsibilities

  • Architect and implement scalable data solutions using AWS.
  • Develop and maintain ETL/ELT pipelines for clean, structured datasets.
  • Ensure reliability and performance of data solutions.
  • Collaborate with stakeholders to deliver data solutions.

Skills

AWS services (S3, Glue, Redshift, Lambda, Step Functions)
SQL
Python
Data modeling
Data transformation
Problem-solving
Collaboration
Data governance

Tools

Power BI
Snowflake
Databricks

Job description

At Pos Malaysia, we're passionate about building trust to connect lives and businesses for a better tomorrow. As we transform this incredible 200-year-old business, we're seeking a highly motivated, engaged, and driven Data Engineer to join our team. If you're excited by transformation and the significant opportunity it represents, we encourage you to apply.

Own It
  • Architect and implement scalable data solutions using AWS services like S3, Glue, Redshift, Lambda, and Step Functions.
  • Develop and maintain robust ETL/ELT pipelines that transform raw data into clean, structured, and production-ready datasets (an illustrative sketch follows this list).
  • Ensure the reliability and high performance of all data solutions from design to operation.
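
By way of illustration only (not Pos Malaysia's actual code; the S3 paths and column names are invented), a pipeline step of this kind might look like the following in PySpark, the engine that AWS Glue jobs build on:

    # Minimal PySpark ETL sketch: raw JSON events in, curated Parquet out.
    # All S3 paths and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("parcel-events-etl").getOrCreate()

    # Extract: raw JSON events landed in S3.
    raw = spark.read.json("s3://example-raw-bucket/parcel-events/")

    # Transform: drop malformed rows, normalise types, deduplicate.
    clean = (
        raw.where(F.col("event_id").isNotNull())
           .withColumn("event_ts", F.to_timestamp("event_ts"))
           .withColumn("event_date", F.to_date("event_ts"))
           .dropDuplicates(["event_id"])
    )

    # Load: partitioned Parquet, ready for Redshift Spectrum or BI tools.
    clean.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3://example-curated-bucket/parcel-events/")
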
Build Trust
  • Establish and enforce strong data governance, security, and quality controls across all pipelines and datasets (see the illustrative check after this list).
  • Write clean, well-documented code and champion best practices in testing, automation, and version control.
  • Engage consistently with internal counterparts to resolve issues, fostering strong working relationships.
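
As a concrete but purely hypothetical example of such a quality control, a pipeline might run a small validation gate before publishing a batch; the column names and the 5% threshold below are assumptions for illustration:

    # Hypothetical data-quality gate run before a batch is published.
    # Column names and thresholds are illustrative assumptions.
    import pandas as pd

    def quality_gate(df: pd.DataFrame) -> None:
        """Raise AssertionError if the batch breaks basic rules."""
        assert df["tracking_no"].notna().all(), "null tracking numbers"
        assert df["tracking_no"].is_unique, "duplicate tracking numbers"
        null_rate = df["delivered_at"].isna().mean()
        assert null_rate < 0.05, f"delivered_at null rate {null_rate:.1%}"

    batch = pd.DataFrame({
        "tracking_no": ["MY001", "MY002"],
        "delivered_at": ["2024-01-02", "2024-01-03"],
    })
    quality_gate(batch)  # a failing batch would halt the pipeline here
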
One Team
  • Collaborate closely with analysts, business stakeholders, and IT teams to understand data needs and deliver user-friendly solutions.
  • Contribute to a knowledge-sharing culture and help elevate the team's technical standards.
  • Develop positive working relationships with customers, partners, and other department staff.
Move Fast
  • Continuously refine and optimize data models (e.g., star schemas; see the sketch after this list) to boost reporting efficiency.
  • Proactively identify and resolve performance bottlenecks and inefficiencies in data workflows.
  • Adapt, simplify, and act quickly based on dynamic business needs and market changes.
  • Enjoy working in a fast-moving transformation journey, embracing change and driving progress.
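
For readers unfamiliar with the term, a star schema keeps one narrow fact table keyed to small dimension tables so reporting joins stay cheap. The sketch below uses invented table names, and SQLite stands in for a warehouse such as Redshift purely so the example runs anywhere:

    # Star-schema sketch: one fact table joined to small dimensions.
    # Table and column names are invented for illustration.
    import sqlite3

    ddl = """
    CREATE TABLE dim_date   (date_key INTEGER PRIMARY KEY, full_date TEXT, month TEXT);
    CREATE TABLE dim_branch (branch_key INTEGER PRIMARY KEY, branch_name TEXT, state TEXT);
    CREATE TABLE fact_delivery (
        delivery_id INTEGER PRIMARY KEY,
        date_key    INTEGER REFERENCES dim_date(date_key),
        branch_key  INTEGER REFERENCES dim_branch(branch_key),
        parcels     INTEGER,
        on_time     INTEGER
    );
    """

    with sqlite3.connect(":memory:") as conn:
        conn.executescript(ddl)
        # Reports join the fact table to dimensions, e.g. parcels per month:
        conn.execute(
            "SELECT d.month, SUM(f.parcels) "
            "FROM fact_delivery f JOIN dim_date d USING (date_key) "
            "GROUP BY d.month"
        ).fetchall()
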
Delight the Customers
  • Build reliable datasets and pipelines that empower stakeholders to self-serve insights.
  • Translate business questions into effective data solutions that drive better decision-making and improved customer outcomes.
Requirements
  • Minimum 8-10 years in data engineering, with at least 5 years in cloud-based environments (preferably AWS).
  • Experience building and managing data models and pipelines for large-scale data environments.
  • Proficiency in AWS tools: S3, Glue, Redshift, Lambda, Step Functions.
  • Strong SQL and Python skills for data transformation and automation.
  • Hands-on experience with various data sources, including structured and unstructured data.
  • Solid grasp of data warehousing concepts, data modeling, and modern ELT/ETL practices.
  • Strong problem-solving skills, with the ability to work independently while fostering teamwork and collaboration across business units and cross-functional teams.
  • Proven ability to thrive in a fast-paced, dynamic, and transformative environment, embracing change and driving progress.
  • Strategic thinker with strong analytical skills.
  • Excellent communication and negotiation skills, capable of engaging effectively with diverse stakeholders.
Preferred Requirements
  • Experience building data warehouses for a large enterprise.
  • Experience with Power BI, Snowflake on AWS, or Databricks.
  • Familiarity with DevOps practices such as CI/CD.
  • Understanding of AWS-based data security and governance best practices.

This is a super exciting time to join Pos Malaysia. Your contribution will help us write the next chapter in our history.
