Riyadh
On-site
SAR 80,000 - 120,000
Full time
20 days ago
Job summary
An innovative firm is seeking a skilled GCP Cloud Architect with over a decade of experience in cloud services and data lakes. This role involves architecting robust data lake solutions, leveraging Google Cloud technologies to ensure scalability, security, and performance. You'll be responsible for designing data ingestion patterns, implementing governance, and utilizing tools like Terraform for infrastructure automation. Join a forward-thinking team that values collaboration and technical excellence, and make a significant impact on cloud architecture and data management strategies.
Qualifications
- 10+ years of experience in cloud architecture, particularly with GCP.
- Deep knowledge of GCP services and data lake principles.
Responsibilities
- Architect scalable data lake solutions using GCP services.
- Implement policies for data access and quality within the data lake.
Skills
Google Cloud Platform (GCP)
Kubernetes Engine (GKE)
Cloud Functions
Cloud Run
IAM, Roles, Permissions
Cloud Monitoring
Cloud Logging
Data Lakes
Data Transformation
Infrastructure as Code (Terraform)
Tools
Terraform
Cloud Deployment Manager
BigQuery
Cloud Dataflow
Cloud Dataproc
Pub/Sub
Overview
GCP Cloud Architect
Responsibilities
- Deep knowledge of Google Cloud Platform (GCP) services:
  - Compute Engine, Kubernetes Engine (GKE), Cloud Functions, Cloud Run.
  - Cloud Storage (object storage, lifecycle management), Persistent Disk.
  - VPC, Firewall Rules, Load Balancing, Cloud DNS.
  - IAM, Roles, Permissions, Service Accounts.
  - Cloud Monitoring, Cloud Logging.
- Understanding GCP pricing models and cost optimization strategies.
- Architectural Design Principles: scalability, reliability, security, performance, and cost-effectiveness.
- Hybrid and Multi-Cloud Architectures: understanding how GCP integrates with on-premises and other cloud environments.
- Understanding the principles, benefits, and challenges of data lakes.
- Implementing policies for data access, quality, and compliance within the data lake.
- Designing and implementing systems for cataloging and managing metadata within the data lake using Dataplex.
- Data Ingestion Patterns: batch and streaming data ingestion strategies.
- Data Transformation and Processing: frameworks suitable for data lakes.
- Data Consumption and Access: enabling various users and applications to access data in the lake.
- Specific GCP Data Lake Services:
  - Google Cloud Storage: expertise in designing the storage layer for the data lake, including bucket organization, storage classes, and lifecycle policies (see the Cloud Storage sketch after this list).
  - BigQuery: expertise in querying and analyzing large datasets within the data lake.
  - Dataplex: designing and implementing data lakes with integrated governance, discovery, and security.
  - Cloud Dataflow: architecting scalable data processing pipelines for ETL/ELT within the data lake.
  - Cloud Dataproc: understanding how to leverage managed Spark and Hadoop clusters for data processing.
  - Pub/Sub and Dataflow for streaming: designing real-time data ingestion and processing pipelines (see the ingestion sketch after this list).
- Infrastructure as Code: proficiency in tools such as Terraform or Cloud Deployment Manager for automating infrastructure provisioning.
- Security Best Practices for GCP: implementing security controls for data at rest and in transit.
- Disaster Recovery and Business Continuity: designing resilient data lake architectures.
- Communication and Collaboration: ability to communicate technical concepts to both technical and non-technical stakeholders.
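As one illustration of the storage-layer and automation work listed above, the following minimal Python sketch provisions a data lake landing bucket with tiered lifecycle rules using the google-cloud-storage client. The project ID, bucket name, and region are hypothetical placeholders; in practice this setup would typically be codified in Terraform so it is versioned and reviewable.

```python
from google.cloud import storage

# Hypothetical project, bucket, and region names used purely for illustration.
client = storage.Client(project="example-project")

bucket = client.bucket("example-data-lake-raw")
bucket.storage_class = "STANDARD"

# Tier raw objects to colder storage classes as they age, then expire them.
bucket.add_lifecycle_set_storage_class_rule("NEARLINE", age=30)
bucket.add_lifecycle_set_storage_class_rule("COLDLINE", age=180)
bucket.add_lifecycle_delete_rule(age=365)

created = client.create_bucket(bucket, location="me-central2")
print(f"Created {created.name} with {len(list(created.lifecycle_rules))} lifecycle rules")
```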
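As a small sketch of the streaming ingestion side, the snippet below publishes a JSON event to a Pub/Sub topic that a downstream Dataflow pipeline would consume, transform, and land in the lake. The project name, topic name, and event fields are assumptions for illustration only.

```python
import json

from google.cloud import pubsub_v1

# Hypothetical project and topic names used purely for illustration.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "raw-events")

# A toy event; a downstream Dataflow job would parse, transform, and store it.
event = {"order_id": "12345", "amount_sar": 250.0}
future = publisher.publish(
    topic_path,
    json.dumps(event).encode("utf-8"),
    source="checkout",  # message attribute usable for routing or filtering
)
print(f"Published message {future.result()}")
```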
Qualifications
Years of Experience: 10+