Senior Data Platform Architect (GCP Expert)

Klook (客路旅行)

Singapore

On-site

USD 90,000 - 150,000

Full time


Job summary

Join Klook as a Senior Data Platform Architect, where you'll use modern data tools such as Flink and Kafka to design and optimize scalable data platforms that integrate seamlessly with AI/ML pipelines on Google Cloud Platform. You'll collaborate with cross-functional teams, mentor engineers, and drive best practices in cloud-native architectures. If you're passionate about turning data into actionable insights and thrive in a dynamic environment, this role is for you.

Qualifications

  • 8+ years in data engineering, with 3 years on GCP.
  • Proficiency in real-time data tools and strong SQL/Python skills.

Responsibilities

  • Design and implement scalable, real-time data solutions.
  • Lead GCP-based data ingestion and ML pipelines.

Skills

Apache Flink
Kafka
Google Cloud Platform (GCP)
SQL
Python
Java
Scala
Data Governance

Education

Bachelor's Degree in Computer Science or related field

Tools

BigQuery
Dataflow
Pub/Sub
Dataproc
Vertex AI
Data Catalog

Job description

We are seeking a seasoned Data Platform Architect with deep expertise in modern data technologies (e.g., Flink, Kafka, Lakehouse, Paimon, Doris, BigQuery) and Google Cloud Platform (GCP). You will design, build, and optimize scalable data platforms, ensuring seamless integration across data ingestion, warehousing, AI/ML pipelines (Vertex AI), and data governance (Data Catalog). The ideal candidate is a hands-on leader who bridges technical execution with strategic vision.

What you'll do

Architect Modern Data Platforms

  • Design and implement scalable, real-time data solutions using Apache Flink, Kafka, Lakehouse architectures, Apache Paimon, Apache Doris, and BigQuery.
  • Optimize data pipelines for batch and streaming workflows, ensuring low latency and high throughput.

GCP-Centric Data Engineering

  • Lead GCP-based data ingestion (Pub/Sub, Dataflow), warehousing (BigQuery, Dataproc), and ML pipelines (Vertex AI, TFX).
  • Implement DataOps/MLOps practices for CI/CD, monitoring, and governance.

Data Governance & Cataloging

  • Deploy and manage data catalogs (e.g., Data Catalog, Collibra, Alation) for metadata management, lineage, and compliance.
  • Enforce data quality, security, and access controls.

Cross-Functional Leadership

  • Collaborate with AI/ML teams to productionize models and embed analytics into business processes.
  • Mentor engineers and evangelize best practices in cloud-native data architectures.

What you'll need

  • 8+ years in data engineering, architecture, or solutions roles, with at least 3 years focused on GCP.
  • Proficiency in real-time data tools: Flink, Kafka, Paimon, Doris, Spark.
  • Hands-on experience with GCP services: BigQuery, Dataflow, Pub/Sub, Dataproc, Vertex AI, Cloud Storage.
  • Strong SQL/Python/Java/Scala skills; familiarity with data lakehouse frameworks (Delta Lake, Iceberg, Hudi).
  • Experience with data catalogs (e.g., GCP Data Catalog, OpenMetadata) and metadata management.
  • Proven track record of designing large-scale data platforms (10M+ rows/day).
  • Ability to translate business needs into technical solutions.
  • Certifications such as Google Cloud Professional Data Engineer (or AWS/Azure equivalents) are a plus.

Nice to have

  • Knowledge of multi-cloud integrations (e.g., AWS S3 + GCP).
  • Exposure to LLM pipelines, vector databases, or generative AI workflows.
