
GCP Data Architect

Endava

Berlin

Hybrid

EUR 80,000 - 120,000

Full-time

Posted yesterday

Summary

A leading company in the tech industry seeks a GCP Data Architect to design and implement data architectures on Google Cloud. You will guide clients through AI transformations and oversee modernization efforts. This senior role also involves mentoring engineers and leading technical delivery for critical projects. The position is hybrid, allowing remote work two days a week.

Benefits

Competitive salary package
Company performance bonuses
Career coaching and development
Flexible working hours
Global wellbeing program

Qualifications

  • 7 years in data engineering or architecture roles.
  • Strong expertise in Google Cloud services.
  • Fluency in German (C1) and English (C1).

Responsibilities

  • Lead design of secure data architectures on GCP.
  • Mentor junior engineers and oversee technical delivery.
  • Collaborate with stakeholders on data strategy.

Skills

Data Engineering
Data Architecture
Cloud Services
Stakeholder Engagement
SQL
Python

Education

Professional Data Engineer Certification
Professional Cloud Architect Certification

Tools

Terraform
BigQuery
Airflow

Job Description

As a GCP Data Architect, you will be a senior technical leader responsible for designing, implementing, and overseeing robust data architectures on Google Cloud Platform (GCP). You'll guide customers through complex data and AI transformations, shape data strategy, and lead technical delivery for critical projects. Your work will span advisory, architecture, governance, and implementation, with a strong focus on data platform modernization, AI enablement, and operational excellence.

This role also includes mentoring engineers, working closely with Google teams, contributing to strategic pre-sales efforts, and helping clients realize business value through cloud-native data and AI solutions.

Location: Germany
Travel: Occasional, national / international

Key Responsibilities

  • Lead the design and implementation of secure, scalable, and cost-effective data architectures on GCP.
  • Collaborate with client stakeholders to define enterprise data strategies, roadmaps, and reference architectures.
  • Architect and implement advanced data pipelines (ETL / ELT) and AI-ready data platforms (e.g. Data Lakes, Data Mesh).
  • Guide modernization of legacy systems to cloud-native solutions using tools like BigQuery, Cloud SQL, and Dataproc.
  • Enable AI / ML capabilities through robust MLOps platforms, model management pipelines, and data governance strategies.
  • Drive adoption of best practices around CI / CD, DataOps, and MLOps on GCP.
  • Participate in delivery oversight, ensuring technical excellence across engagements.
  • Mentor junior engineers and lead knowledge-sharing initiatives across teams.
  • Support proposal writing, solution demos, and technical workshops during pre-sales cycles.

Qualifications:

Technical Expertise:

  • 7 years of experience in data engineering, architecture, or analytics roles.
  • Strong expertise with Google Cloud services: BigQuery, Dataflow, Pub/Sub, Cloud Storage, Vertex AI, Cloud Composer, etc.
  • Deep knowledge of data modeling, data warehousing, distributed systems, and real-time / stream processing.
  • Proficiency in SQL, Python, and data transformation tools.
  • Experience implementing Data Lakes, Lakehouses, or Data Mesh architectures.
  • Hands-on experience with orchestration frameworks (e.g. Airflow, Composer).
  • Good understanding of ML lifecycle management and MLOps tools.
  • Experience with Infrastructure as Code (Terraform, Deployment Manager) and CI / CD for data systems.

Strategic and Leadership Skills:

  • Proven ability to shape data strategy and architectural direction at the enterprise level.
  • Strong stakeholder engagement and consulting experience.
  • Ability to present technical concepts to both engineering teams and C-level stakeholders.
  • Experience leading cross-functional teams in hybrid (onshore / offshore) environments.

Certifications (preferred)

  • Professional Data Engineer or Professional Cloud Architect (Google Cloud).
  • Relevant certifications in AI / ML or Kubernetes (e.g. TensorFlow Developer, CKAD).

Language knowledge:

  • Fluency in German (C1) and English (C1).

Willingness to work in hybrid mode, two days a week from the Berlin office.

Additional Information:

Discover some of the global benefits that empower our people to become the best version of themselves:

  • Finance: Competitive salary package, share plan, company performance bonuses, value-based recognition awards, referral bonus;
  • Career Development: Career coaching, global career opportunities, non-linear career paths, internal development programmes for management and technical leadership;
  • Learning Opportunities: Complex projects, rotations, internal tech communities, training, certifications, coaching, online learning platform subscriptions, pass-it-on sessions, workshops, conferences;
  • Work-Life Balance: Hybrid work and flexible working hours, employee assistance programme;
  • Health: Global internal wellbeing programme, access to wellbeing apps;
  • Community: Global internal tech communities, hobby clubs and interest groups, inclusion and diversity programmes, events and celebrations.

At Endava, we're committed to creating an open, inclusive, and respectful environment where everyone feels safe, valued, and empowered to be their best. We welcome applications from people of all backgrounds, experiences, and perspectives, because we know that inclusive teams help us deliver smarter, more innovative solutions for our customers. Hiring decisions are based on merit, skills, qualifications, and potential. If you need adjustments or support during the recruitment process, please let us know.

Employment Type: Full-time


Vacancy: 1
