Job Search and Career Advice Platform

GCP Data Architect - Remote

Software International

Remote

CAD 80,000 - 100,000

Full time


Job summary

A premier tech consulting firm is seeking a skilled GCP Data Architect for a remote contract role based in Toronto. This position offers an opportunity to create and oversee data solutions on Google Cloud Platform. The ideal candidate will have strong SAP data integration expertise, experience with data governance, and the ability to design complex data models. This is a 6-month contract with an hourly rate of $110 to $140 CAD, depending on experience.

Qualifications

  • 5+ years of experience with Google Cloud Platform.
  • Strong expertise in SAP data extraction and modeling.
  • Experience in designing data governance frameworks.

Responsibilities

  • Define and implement enterprise-wide data strategy.
  • Design data models supporting analytics and operational workloads.
  • Lead on-premise to cloud migrations for enterprise data platforms.

Skills

Google Cloud Platform (GCP)
SAP data integration
Data strategy
SQL programming
Python programming

Tools

GCP BigQuery
Apache Airflow
Boomi

Job description

Software International (SI) supplies technical talent to a variety of clients ranging from Fortune 100/500/1000 companies to small and mid‑sized organizations in Canada/US and Europe.

We currently have a contract role as a GCP Data Architect with our global consulting client, working remotely. This is a 6-month contract initially, but it could be extended.

Role: GCP Data Architect

Type: Contract

Duration: 6 months to start + potential extension

Location: Toronto, ON - remote with occasional office visits

Rate: $110 - $140 CAD/hr (C2C), depending on overall experience

GCP Data Architect - Role Overview

We are seeking a highly skilled Google Cloud Platform (GCP) Data Architect with strong SAP data integration expertise to design, implement, and oversee enterprise‑grade data solutions. The ideal candidate will combine deep expertise in cloud data platforms, data governance, security, and data modeling with hands‑on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. This role will collaborate with business stakeholders and engineering teams to create a robust, scalable, and cost‑effective data ecosystem that bridges SAP and GCP environments.

Key Responsibilities
1. Data Strategy, Security & Governance
  • Define and implement an enterprise‑wide data strategy aligned with business goals.
  • Establish data governance frameworks, data classification, retention, and privacy policies.
  • Ensure compliance with industry standards and regulations (e.g., GDPR, HIPAA, PIPEDA).
2. Data Modeling
  • Design conceptual, logical, and physical data models to support analytics and operational workloads.
  • Implement star, snowflake, and data vault models for analytical systems.
  • Implement S/4HANA CDS views in Google BigQuery.
3. Cloud Architecture & Pipelines
  • Architect data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc.
  • Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow.
4. Data Integration
  • Integrate data from multiple systems, including SAP BW, SAP HANA, and Business Objects, using tools such as SAP SLT or the Google Cortex Framework.
  • Leverage integration tools such as Boomi for system interoperability.
5. Programming & Analytics
  • Develop complex SQL queries for analytics, transformations, and performance tuning.
  • Build automation scripts and utilities in Python.
  • Apply a good understanding of CDS views and the ABAP language.
6. System Migration
  • Lead on‑premise to cloud migrations for enterprise data platforms (SAP BW/BusinessObjects).
  • Manage migration of SAP datasets to GCP, ensuring data integrity and minimal downtime.
7. DevOps for Data
  • Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform.
  • Apply infrastructure‑as‑code principles for reproducible and scalable deployments.
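To give candidates a concrete flavor of the star-schema modeling and analytical SQL the responsibilities above describe, here is a minimal sketch. SQLite stands in for BigQuery so the example is self-contained; all table and column names are illustrative, not from the posting.

```python
import sqlite3

# Minimal star schema: one fact table joined to two dimension tables.
# SQLite is a stand-in for BigQuery; names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
INSERT INTO dim_customer VALUES (1, 'EMEA'), (2, 'NA');
INSERT INTO dim_date VALUES (20240101, 2024), (20250101, 2025);
INSERT INTO fact_sales VALUES
    (1, 20240101, 100.0),
    (2, 20250101, 250.0),
    (1, 20250101, 50.0);
""")

# Typical analytical query against a star schema: join the fact table
# to its dimensions and aggregate by dimension attributes.
rows = cur.execute("""
    SELECT c.region, d.year, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_customer c USING (customer_key)
    JOIN dim_date d USING (date_key)
    GROUP BY c.region, d.year
    ORDER BY c.region, d.year
""").fetchall()
print(rows)  # [('EMEA', 2024, 100.0), ('EMEA', 2025, 50.0), ('NA', 2025, 250.0)]
```

The same dimensional layout translates directly to BigQuery, where denormalized dimensions and partitioned fact tables are the usual performance levers.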

Preferred Skills
  • Proven experience with GCP BigQuery, Cloud Storage, Pub/Sub, Dataflow.
  • Strong SQL and Python programming skills.
  • Hands‑on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
  • Knowledge of data governance frameworks and data security best practices.
  • Experience with Boomi, Informatica, or MuleSoft for SAP and non‑SAP integrations.
  • Experience in Google Cortex Framework for SAP‑GCP integrations.
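As a flavor of the Python automation and migration data-integrity work the role calls for, here is a hedged sketch of a post-migration reconciliation check. The function name and dict-based interface are hypothetical, chosen only to illustrate the idea of comparing source and target row counts.

```python
def reconcile_row_counts(source_counts, target_counts):
    """Compare per-table row counts from a source system and its
    migrated target; return the tables whose counts disagree.

    Both arguments are {table_name: row_count} dicts. Names and the
    interface are illustrative, not from the posting.
    """
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table)  # None if table missing in target
        if tgt != src:
            mismatches[table] = (src, tgt)
    # Tables present only in the target are also flagged.
    for table in target_counts.keys() - source_counts.keys():
        mismatches[table] = (None, target_counts[table])
    return mismatches

checks = reconcile_row_counts(
    {"sales": 1000, "customers": 250},
    {"sales": 1000, "customers": 249},
)
print(checks)  # {'customers': (250, 249)}
```

In practice the counts would be pulled from the source (e.g., SAP BW) and from BigQuery after the load, and an empty result would gate the cutover.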