Data Architect - GCP

New Era Solutions

Bengaluru

On-site

INR 30,00,000 - 50,00,000

Full time

2 days ago

Job summary

A technology solutions provider is seeking a Data Architect – GCP to lead the design and modernization of enterprise data platforms. This role involves architecting scalable data solutions using GCP services and guiding data engineering teams. The ideal candidate will have over 10 years of experience in data architecture, strong GCP skills, proven leadership, and stakeholder management capabilities. This position is based in Bengaluru and offers a challenging environment for professionals focused on innovation.

Qualifications

  • 10+ years of experience in data architecture and engineering.
  • 4+ years of hands-on GCP experience.
  • Proven leadership in client-facing roles.

Responsibilities

  • Design enterprise-scale data architectures using GCP services.
  • Lead implementation of medallion architecture patterns.
  • Manage distributed team of data engineers.

Skills

Data architecture and engineering
GCP (BigQuery, Dataflow, Cloud Composer, Dataform)
Python
SQL
Stakeholder management

Education

GCP Professional Data Engineer or Cloud Architect certification

Tools

Cloud Composer
Dataflow
BigQuery
Atlan
Collibra

Job description

AuxoAI is hiring a Data Architect – GCP to lead enterprise data platform design, architecture modernization, and solution delivery across global client engagements. In this client-facing role, you will architect scalable data platforms using GCP-native services, guide onshore/offshore data engineering teams, and define best practices across ingestion, transformation, governance, and consumption layers.

This role is ideal for someone who combines deep GCP platform expertise with leadership experience, and is confident working with both engineering teams and executive stakeholders.

Responsibilities

  • Design and implement enterprise-scale data architectures using GCP services, with BigQuery as the central analytics platform
  • Lead end-to-end implementation of medallion architecture (Raw → Processed → Curated) patterns
  • Oversee data ingestion pipelines using Cloud Composer, Dataflow (Apache Beam), Pub/Sub, and Cloud Storage
  • Implement scalable ELT workflows using Dataform and modular SQLX transformations
  • Optimize BigQuery workloads through advanced partitioning, clustering, and materialized views
  • Lead architectural reviews, platform standardization, and stakeholder engagements across engineering and business teams
  • Implement data governance frameworks leveraging tools like Atlan, Collibra, and Dataplex
  • Collaborate with ML teams to support Vertex AI-based pipeline design and model deployment
  • Enable downstream consumption through Power BI, Looker, and optimized data marts
  • Drive adoption of Infrastructure-as-Code (Terraform) and promote reusable architecture templates
  • Manage a distributed team of data engineers; set standards, review code, and ensure platform stability

Requirements

  • 10+ years of experience in data architecture and engineering
  • 4+ years of hands‑on GCP experience, including BigQuery, Dataflow, Cloud Composer, Dataform, and Cloud Storage
  • Deep understanding of streaming + batch data patterns, event‑driven ingestion, and modern warehouse design
  • Proven leadership of cross‑functional, distributed teams in client‑facing roles
  • Strong programming skills in Python and SQL
  • Experience working with data catalog tools (Atlan, Collibra), Dataplex, and enterprise source connectors
  • Excellent communication and stakeholder management skills

Preferred Qualifications

  • GCP Professional Data Engineer or Cloud Architect certification
  • Experience with Vertex AI Model Registry, Feature Store, or ML pipeline integration
  • Familiarity with AlloyDB, Cloud Spanner, Firestore, and enterprise integration tools (e.g., Salesforce, SAP, Oracle)
  • Background in legacy platform migration (Oracle, Azure, SQL Server)