Sr Cloud Data Architect (GCP)

ZipRecruiter

London

On-site

GBP 80,000 - 120,000

Full time

22 days ago

Job summary

A leading company in the tech industry seeks a Senior Cloud Data Architect for a long-term contract in London. The role involves leading data migration projects and creating data pipelines using Google Cloud's advanced services. The ideal candidate will have a strong background in data architecture and cloud data services, relevant certifications, and substantial experience in the field.

Qualifications

  • 12+ years in data architecture and engineering.
  • 8+ years of hands-on experience as a Data Engineer on GCP.
  • Google Professional Data Engineer certification (Mandatory).

Responsibilities

  • Lead large-scale data migration programs.
  • Design automated, production-grade data pipelines.
  • Engage with enterprise customers for data transformation.

Skills

Leadership
Data Migration
Cloud Data Services
Data Pipeline Design
SQL
Problem Solving

Education

Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related technical field

Tools

Google Cloud Platform (GCP)
Apache Airflow
Git

Job description

Sr Cloud Data Architect

Location: London, UK

Duration: Long-term contract

We are seeking a Senior Cloud Data Architect. The ideal candidate will have extensive experience leading large-scale data migration programs and designing automated, production-grade data pipelines across various industries. This strategic role involves deep architectural engagement with enterprise customers, driving transformation through Google Cloud's advanced data services such as Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.

Required Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related technical field.
  • 12+ years of experience in data architecture and data engineering with proven skills and leadership in large-scale cloud data programs.
  • 8+ years of hands-on experience as a Data Engineer, with at least 3+ years specifically working with Google Cloud Platform (GCP) data services.
  • Strong proficiency in SQL and experience with schema design and query optimization for large datasets.
  • Expertise in BigQuery, including advanced SQL, partitioning, clustering, and performance tuning.
  • Hands-on experience with at least one of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow).
  • Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development; Scala is mandated in some cases.
  • Understanding of data warehousing and data lake concepts and best practices.
  • Experience with version control systems (e.g., Git).
  • 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
  • Hands-on experience with orchestration tools like Apache Airflow or Cloud Composer.
  • Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies.
  • Track record of success advising C-level executives and aligning technical solutions with business goals.
  • Google Professional Data Engineer certification (Mandatory).
  • Google Professional Cloud Architect certification or equivalent.