
Senior Data Engineer

Lenovo

Remote

MYR 120,000 - 160,000

Full time

Yesterday

Job summary

A leading technology company is seeking a Staff Data Engineer to develop and maintain scalable data pipelines. You will ensure data quality and governance and support AI capabilities within a next-generation hybrid-cloud management platform. The ideal candidate has over 5 years of experience in data engineering, is proficient in Python and SQL, and is familiar with cloud data platforms such as Azure, AWS, or GCP. This is an opportunity to work on innovative AI-driven solutions that optimize business operations globally.

Qualifications

  • 5+ years of experience in data engineering, delivering reliable, scalable data solutions.
  • Proficiency in Python and SQL; familiarity with distributed data systems.
  • Hands-on experience with cloud data platforms such as Azure, AWS, and GCP.

Responsibilities

  • Build, optimize, and maintain scalable data pipelines for both structured and unstructured data.
  • Ensure compliance with data quality, governance, and security standards.
  • Support AI-specific capabilities such as prompt management and model improvement.

Skills

Python
SQL
Data engineering
Problem-solving
Cross-functional collaboration

Tools

Airflow
Kafka
PySpark
Azure
AWS
GCP

Job description

We’re building a next-generation hybrid-cloud management platform that combines Generative AI and classical ML to help businesses optimize cost, reliability, and operations. As Staff Data Engineer, you’ll be responsible for delivering robust, scalable data solutions—ensuring our data is clean, well‑organized, and ready to power analytics, AI models, and intelligent agents.

You’ll work closely with Backend, Product, and AI teams to make data accessible, reliable, and governed, while enabling advanced AI capabilities like prompt management, memory, and fine‑tuning. You’ll focus on implementing and optimizing data pipelines, supporting production workloads, and driving continuous improvements in data quality and performance.

What You’ll Do

Build, optimize, and maintain scalable data pipelines and core data assets across the platform for both structured and unstructured data.

Implement data quality, governance, and security standards, ensuring compliance with privacy regulations.

Prepare and organize data for analytics, ML, and AI agents, including conversational and behavioral datasets.

Support AI‑specific needs such as prompt logging, memory management, and collecting data for model improvement.

Collaborate with cross‑functional teams to ensure data is discoverable, documented, and optimized for performance and cost.

Troubleshoot, monitor, and enhance data workflows to ensure reliability and efficiency in production environments.

What We’re Looking For

5+ years of experience in data engineering, delivering reliable, scalable data solutions.

Proficiency in Python and SQL; familiarity with distributed data systems and modern data tools (e.g., Airflow, Kafka, PySpark).

Working knowledge of multiple database types, including relational, document, vector, and graph databases.

Understanding of data governance, privacy, and compliance best practices.

Experience with data pipelines for supervised ML and Agentic AI is highly desirable.

Strong problem‑solving skills and ability to work effectively across teams.

Experience supporting production data systems and optimizing for performance and cost.

Hands‑on experience with cloud data platforms (Azure, AWS, GCP).

Why Join Us?

You’ll play a key role in building and delivering the data backbone of a platform that enables smarter, cost‑efficient operations for businesses worldwide. This is an opportunity to work on cutting‑edge AI‑driven solutions, drive delivery excellence, and make a tangible impact on our product and customers.
