
Sr. Officer-Data Engineer - GCP Platform

Indosat

Indonesia

On-site

IDR 200.000.000 - 300.000.000

Full time

Job summary

A leading telecommunications company in Indonesia seeks a skilled Data Engineer to enhance their data platform on Google Cloud Platform (GCP). Responsibilities include building a global-standard data warehouse, optimizing data pipelines, and implementing MLOps best practices. The ideal candidate has over 5 years of experience with GCP tools and strong problem-solving skills.

Qualifications

  • 5+ years of experience as Data Engineer with strong proficiency in GCP data tools.
  • Solid understanding of data warehousing principles.
  • Experience in implementing data governance frameworks.

Responsibilities

  • Design, build, and maintain a scalable, global-standard data warehouse on GCP.
  • Create clean, well-structured data marts.
  • Build and maintain ETL pipelines ensuring timely, high-quality data.

Skills

GCP data tools
ETL processes
MLOps practices
Collaboration skills
Problem-solving

Tools

BigQuery
Dataflow
Composer
Airflow

Job description

Location: ID

Employment Status: Permanent

Description:

We’re looking for a skilled Data Engineer to join our team and take ownership of our data platform built on Google Cloud Platform (GCP). You’ll play a key role in building robust data infrastructure, enabling data governance, supporting machine learning operations, and ensuring cost-effective, high-quality data pipelines.

What You’ll Do

  • Design, build, and maintain a scalable, global-standard data warehouse on GCP.
  • Create clean, well-structured data marts to support business users and dashboards.
  • Build and maintain ETL pipelines using Airflow or equivalent tools to ensure timely and high-quality data (a minimal illustrative sketch follows this list).
  • Implement and manage data governance tools and services (data catalog, data quality, user access control, PII encryption mechanisms, etc.).
  • Administer GCP infrastructure, enabling and optimizing services for the data platform.
  • Monitor, manage and optimize GCP / data platform costs to ensure efficiency.
  • Implement MLOps best practices, from feature engineering to deployment, monitoring, and retraining.
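
As an illustration of the Airflow-based ETL work described above, below is a minimal sketch of a daily DAG that loads files from Cloud Storage into a BigQuery staging table. The bucket, dataset, table, and DAG names are hypothetical placeholders, not details taken from this posting.

# Minimal illustrative sketch only; bucket, dataset, and table names are
# hypothetical placeholders, not part of this job posting.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_sales_to_bq",          # hypothetical pipeline name
    schedule="@daily",                   # run once per day
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Load newly landed CSV files from a GCS bucket into a BigQuery staging table.
    load_sales = GCSToBigQueryOperator(
        task_id="load_sales_to_staging",
        bucket="example-raw-bucket",                                   # hypothetical bucket
        source_objects=["sales/{{ ds }}/*.csv"],                       # files for the run date
        destination_project_dataset_table="analytics.staging_sales",  # hypothetical dataset.table
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )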

What We’re Looking For

  • 5+ years of experience as Data Engineer with strong proficiency in GCP data tools (BigQuery, Dataflow, Composer, etc.).
  • Solid understanding of data warehousing principles, dimensional modelling, and ETL processes.
  • Familiarity with DevOps practices and CI/CD tools for cloud environments.
  • Experience in implementing data governance frameworks and tools.
  • Familiarity with infrastructure management and cost optimization on GCP.
  • Hands-on experience with MLOps practices, workflows and machine learning model lifecycle on GCP.
  • Excellent problem-solving skills and attention to detail.
  • Strong collaboration skills with both technical and business stakeholders.
