
GCP Senior Data Engineer

Endava

Kuala Lumpur

Hybrid

MYR 80,000 - 100,000

Full time

Posted today


Job summary

A global tech consulting firm in Kuala Lumpur is seeking a skilled Data Engineer to build and optimize data solutions on Google Cloud Platform. The successful candidate will collaborate with data scientists to enhance data flows and oversee production environments. The role demands strong proficiency in Python, SQL, and big data frameworks, with a preference for candidates fluent in Cantonese. This position offers a competitive salary and various career development opportunities.

Benefits

Competitive salary package
Career coaching and development
Flexible working hours
Global wellbeing programmes

Qualifications

  • Proven experience as a Data Engineer, with expertise in GCP services.
  • Strong knowledge of data architecture, modeling, and ETL/ELT processes.
  • Hands-on experience with big data frameworks and modern data tools.

Responsibilities

  • Collaborate with Data Analysts and Data Scientists to design data flows.
  • Design and implement cloud-based architecture and deployment processes.
  • Monitor and maintain the production environment for data quality.

Skills

Data architecture
Data modeling
ETL/ELT processes
Python
Apache Spark
SQL scripting
Big data frameworks
Machine learning
Collaboration

Experience

5+ years of experience in Data Engineering

Tools

Google Cloud Platform
Snowflake
Spark
Hadoop

Job description

Technology is our how. And people are our why. For over two decades, we have been harnessing technology to drive meaningful change.

By combining world-class engineering, industry expertise and a people-centric mindset, we consult and partner with leading brands from various industries to create dynamic platforms and intelligent digital experiences that drive innovation and transform businesses.

From prototype to real-world impact - be part of a global shift by doing work that matters.


About the Role

We are looking for a skilled Data Engineer with strong expertise in Google Cloud Platform (GCP). In this role, you will play a key part in building, optimizing, and maintaining data solutions that support advanced analytics, machine learning, and data-driven decision-making. Fluency in Cantonese is a strong advantage and will help you collaborate more effectively with local stakeholders.

Key Responsibilities

Collaborate with Data Analysts and Data Scientists to understand requirements and design efficient data flows, pipelines, and interactive reports.

Work closely with stakeholders to understand how data is used across different teams and propose improvements.

Design and implement cloud-based architecture and deployment processes on GCP.

Build and maintain data pipelines, transformations, and metadata to support business needs.

Create solutions for relational and dimensional data models aligned with platform capabilities.

Develop, test, and optimize big data solutions to ensure scalability and performance.

Monitor and maintain the production environment, ensuring data quality, reliability, and integrity.

Lead initiatives to improve data quality, governance, security, and compliance.

Requirements

Proven experience as a Data Engineer, with expertise in GCP services (e.g., BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage).

Strong knowledge of data architecture, modeling, and ETL/ELT processes.

Hands-on experience with big data frameworks and modern data tools.

Fluency in Cantonese (spoken and written) is a strong advantage and will help you collaborate effectively with local stakeholders.

Strong communication and collaboration skills.

Familiarity with machine learning, AI, and advanced analytics is a plus.

Qualifications

5+ years of experience in Data Engineering.

Strong proficiency in Python and Apache Spark.

Hands‑on experience designing and implementing ETL/ELT processes and data pipelines.

Solid expertise in SQL scripting and query optimization.

Experience with Snowflake or other modern cloud data platforms.

Background in cloud data technologies and tools, with exposure to:

Data processing frameworks (Spark, Hadoop, Apache Beam, Dataproc, or similar)

Data warehouses (BigQuery, Redshift, or equivalent)

Real‑time streaming pipelines (Kinesis, Kafka, or similar)

Batch and serverless data processing

Strong analytical skills with the ability to work with both structured and unstructured data.

Experience in leading IT projects and managing stakeholder expectations.

Additional Information

Discover some of the global benefits that empower our people to become the best version of themselves:

  • Finance: Competitive salary package, share plan, company performance bonuses, value‑based recognition awards, referral bonus;
  • Career Development: Career coaching, global career opportunities, non‑linear career paths, internal development programmes for management and technical leadership;
  • Learning Opportunities: Complex projects, rotations, internal tech communities, training, certifications, coaching, online learning platforms subscriptions, pass‑it‑on sessions, workshops, conferences;
  • Work‑Life Balance: Hybrid work and flexible working hours, employee assistance programme;
  • Health: Global internal wellbeing programme, access to wellbeing apps;
  • Community: Global internal tech communities, hobby clubs and interest groups, inclusion and diversity programmes, events and celebrations.

At Endava, we’re committed to creating an open, inclusive, and respectful environment where everyone feels safe, valued, and empowered to be their best. We welcome applications from people of all backgrounds, experiences, and perspectives—because we know that inclusive teams help us deliver smarter, more innovative solutions for our customers. Hiring decisions are based on merit, skills, qualifications, and potential. If you need adjustments or support during the recruitment process, please let us know.
