GCP Data Architect - Remote

Software International

Toronto

Remote

CAD 110 - 140 per hour

Contract

23 days ago

Job summary

A leading tech consulting firm is seeking a GCP Data Architect to design and implement data solutions in a remote capacity. The ideal candidate will have extensive experience with Google Cloud Platform and SAP data integration. This is a contract role for an initial 6 months, with an hourly rate of $110 - $140 CAD depending on experience. Candidates with strong programming skills in SQL and Python are encouraged to apply.

Qualifications

  • 5+ years of experience as a data architect or similar role.
  • Strong understanding of Google Cloud Platform services.
  • Experience with SAP data extraction and integration.

Responsibilities

  • Define and implement enterprise-wide data strategy.
  • Design conceptual, logical, and physical data models.
  • Architect data solutions on GCP using BigQuery.

Skills

SAP data integration expertise
Cloud data platforms
Data governance
Security
Data modeling
ETL/ELT pipelines
SQL
Python programming

Tools

GCP BigQuery
Apache Airflow
Boomi
SAP SLT
Terraform

Job description

Overview

Software International (SI) supplies technical talent to a variety of clients, ranging from Fortune 100/500/1000 companies to small and mid-sized organizations across Canada, the US, and Europe.

We currently have a contract role as a GCP Data Architect with our global consulting client, working remotely. The contract is for an initial 6 months and could be extended.

Role: GCP Data Architect

Type: Contract

Duration: 6 months to start + potential extension

Location: Toronto, ON - remote with occasional office visits

Rate: $110 - $140 CAD/hr C2C, depending on overall experience

GCP Data Architect - Role Overview

We are seeking a highly skilled Google Cloud Platform (GCP) Data Architect with strong SAP data integration expertise to design, implement, and oversee enterprise-grade data solutions. The ideal candidate will combine deep expertise in cloud data platforms, data governance, security, and data modeling with hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. The role involves collaborating with business stakeholders and engineering teams to create a robust, scalable, and cost-effective data ecosystem that bridges SAP and GCP environments.

Responsibilities
1. Data Strategy, Security & Governance
  • Define and implement enterprise-wide data strategy aligned with business goals.
  • Establish data governance frameworks, data classification, retention, and privacy policies.
  • Ensure compliance with industry standards and regulations (e.g., GDPR, HIPAA, PIPEDA).
2. Data Architecture & Modeling
  • Design conceptual, logical, and physical data models to support analytics and operational workloads.
  • Implement star, snowflake, and data vault models for analytical systems.
  • Implement S/4HANA CDS views in Google BigQuery (see the illustrative sketches after this list).
3. Google Cloud Platform Expertise
  • Architect data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc.
  • Implement cost optimization strategies for GCP workloads.
4. Data Pipelines & Integration
  • Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow.
  • Integrate data from multiple systems, including SAP BW, SAP HANA, and BusinessObjects, using tools such as SAP SLT or the Google Cortex Framework.
  • Leverage integration tools such as Boomi for system interoperability.
5. Programming & Analytics
  • Develop complex SQL queries for analytics, transformations, and performance tuning.
  • Build automation scripts and utilities in Python.
  • Apply a working understanding of CDS views and the ABAP language.
6. System Migration
  • Lead on-premises-to-cloud migrations for enterprise data platforms (SAP BW/BOBJ).
  • Manage migration of SAP datasets to GCP, ensuring data integrity and minimal downtime.
7. DevOps for Data
  • Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform.
  • Apply infrastructure-as-code principles for reproducible and scalable deployments.
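
By way of illustration for items 2 and 3 above, the modeling work often takes the shape of partitioned, clustered BigQuery tables. A rough, non-authoritative sketch of creating a star-schema fact table from Python with the google-cloud-bigquery client follows; the project, dataset, table, and column names are hypothetical:

    from google.cloud import bigquery

    # Hypothetical project; in practice this comes from the deployment environment.
    client = bigquery.Client(project="my-project")

    # Star-schema fact table: partitioned by date, clustered by dimension keys.
    ddl = """
    CREATE TABLE IF NOT EXISTS `my-project.analytics.fact_sales` (
      order_id     STRING,
      customer_key INT64,
      product_key  INT64,
      order_date   DATE,
      net_amount   NUMERIC
    )
    PARTITION BY order_date
    CLUSTER BY customer_key, product_key
    """
    client.query(ddl).result()  # wait for the DDL job to finish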
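For the orchestration work in item 4, a minimal Cloud Composer (Apache Airflow) DAG might wrap a BigQuery job. This is a sketch only; the DAG id, schedule, and stored procedure are assumptions for illustration:

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    # Hypothetical daily load that merges staged SAP extracts into the fact table.
    with DAG(
        dag_id="sap_to_bq_daily",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        merge_fact = BigQueryInsertJobOperator(
            task_id="merge_fact_sales",
            configuration={
                "query": {
                    # Hypothetical stored procedure that performs the MERGE.
                    "query": "CALL `my-project.analytics.sp_merge_fact_sales`()",
                    "useLegacySql": False,
                }
            },
        )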
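Item 5 pairs SQL with Python automation; one common shape is a parameterized query run through the BigQuery client, which keeps dates and other inputs out of the SQL string. Table and parameter names are again hypothetical:

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical daily-revenue query, parameterized rather than string-formatted.
    sql = """
    SELECT order_date, SUM(net_amount) AS revenue
    FROM `my-project.analytics.fact_sales`
    WHERE order_date = @run_date
    GROUP BY order_date
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01")]
    )
    for row in client.query(sql, job_config=job_config).result():
        print(row.order_date, row.revenue)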
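For the CI/CD work in item 7, pipelines often follow terraform apply with a smoke test. A sketch of such a check, assuming a hypothetical list of expected tables, might fail the build if the deployment left anything missing:

    from google.api_core.exceptions import NotFound
    from google.cloud import bigquery

    # Hypothetical tables the Terraform deployment is expected to have created.
    EXPECTED_TABLES = ["my-project.analytics.fact_sales"]

    def _exists(client: bigquery.Client, table_id: str) -> bool:
        try:
            client.get_table(table_id)
            return True
        except NotFound:
            return False

    def main() -> int:
        client = bigquery.Client()
        missing = [t for t in EXPECTED_TABLES if not _exists(client, t)]
        if missing:
            print("Missing tables:", ", ".join(missing))
            return 1  # non-zero exit fails the CI step
        print("All expected tables present.")
        return 0

    if __name__ == "__main__":
        raise SystemExit(main())
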
Preferred Skills
  • Proven experience with GCP BigQuery, Cloud Storage, Pub/Sub, Dataflow.
  • Strong SQL and Python programming skills.
  • Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
  • Knowledge of data governance frameworks and data security best practices.
  • Experience with Boomi, Informatica, or MuleSoft for SAP and non-SAP integrations.
  • Experience in Google Cortex Framework for SAP-GCP integrations.