
Senior Data Engineer

Elitez Pte Ltd

Petaling Jaya

On-site

MYR 90,000 - 120,000

Full time

Posted yesterday


Job summary

A technology firm in Selangor is looking for a skilled Data Engineer to design and maintain data pipelines and optimize data models using GCP services. The ideal candidate will have 4 to 8 years of experience in data engineering with strong SQL and Python skills, as well as hands-on experience with the GCP data stack. This position provides benefits such as a 13th-month bonus and staff insurance.


Skills

SQL
Python
GCP Data Stack
ETL/ELT development
Data modeling

Job description

Data Engineer (Backend + Data Warehousing)

GoGeek Sdn Bhd

Posted 25 days ago

Key Responsibilities
  • Pipeline Development: Design, build, and maintain batch and streaming data pipelines using GCP services such as BigQuery, Dataflow, Dataproc, Composer, Dataform, and Cloud Functions.
  • Data Modeling & Optimization: Implement and optimize data models in BigQuery to support analytics, BI reporting, and machine learning workloads.
  • Data Integration: Connect and transform data from multiple sources, including APIs, databases, event streams, and flat files.
  • Platform Reliability: Monitor and troubleshoot data pipelines, ensuring high availability, scalability, and cost efficiency.
  • Governance & Quality: Implement data validation, quality checks, and security best practices to ensure trusted data.
  • Collaboration: Work closely with analysts, BI developers (Tableau, MicroStrategy), and business teams to enable reporting and self-service analytics.
  • Legacy Support (Light): Provide occasional support for legacy systems (Oracle, MicroStrategy) where needed, focusing on data extraction and gradual modernization.
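The pipeline-reliability and data-quality responsibilities above can be illustrated with a minimal pure-Python sketch of a validation step. The function name, field names, and sample batch are invented for illustration; in practice a check like this would sit inside a Dataflow or Composer task, between extraction and the BigQuery load:

```python
from typing import Iterable


def validate_rows(
    rows: Iterable[dict], required: tuple[str, ...]
) -> tuple[list[dict], list[dict]]:
    """Split rows into (valid, rejected) based on required, non-null fields."""
    valid, rejected = [], []
    for row in rows:
        if all(row.get(field) is not None for field in required):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected


# Hypothetical batch of API records landing in a staging area.
batch = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": None},  # fails the quality check
]
good, bad = validate_rows(batch, required=("order_id", "amount"))
```

Rejected rows would typically be routed to a dead-letter table for inspection rather than silently dropped, which keeps the downstream BigQuery tables trusted.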
Benefits
  • 13th Month Bonus
  • Staff Insurance Provided
Key Requirements
  • Experience: 4 – 8 years of hands‑on experience in data engineering, ETL/ELT development, or related roles.
  • Strong SQL and Python skills for data transformation and automation.
  • Hands‑on experience with GCP Data Stack (BigQuery, Dataflow, Composer, Dataproc, Dataform).
  • Familiarity with orchestration workflows (Airflow/Composer) and CI/CD for data pipelines.
  • Understanding of relational, dimensional, and modern data modeling concepts, with an eye for performance optimization.
  • Exposure to Azure Data Stack (Synapse, Data Factory, Databricks) is a plus.
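The dimensional-modeling requirement above can be sketched as a minimal surrogate-key assignment. The helper `build_dimension` and the sample `sales` records are invented for illustration; in BigQuery this would normally be expressed in SQL (e.g. via Dataform), but the idea is the same: distinct natural keys get stable surrogate keys, and fact rows reference the dimension through them.

```python
def build_dimension(records: list[dict], natural_key: str) -> dict:
    """Assign surrogate keys to distinct natural keys (a simple Type 1 dimension)."""
    dim: dict = {}
    for rec in records:
        nk = rec[natural_key]
        if nk not in dim:
            dim[nk] = {"sk": len(dim) + 1, **rec}
    return dim


# Hypothetical source rows.
sales = [
    {"customer": "acme", "amount": 10.0},
    {"customer": "acme", "amount": 5.0},
    {"customer": "beta", "amount": 7.5},
]
customer_dim = build_dimension(sales, "customer")
# Fact rows reference the dimension by surrogate key, not by the raw value.
facts = [
    {"customer_sk": customer_dim[r["customer"]]["sk"], "amount": r["amount"]}
    for r in sales
]
```

Separating facts from dimensions this way is what makes BI tools like Tableau and MicroStrategy, both mentioned in the responsibilities, efficient to query against.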