Senior Data Architect

CyberSolve IT Inc.

Vaughan

Hybrid

CAD 90,000 - 120,000

Full time

Job summary

A leading IT services firm in York Region seeks a Senior Data Architect to lead the development of next-generation data platforms. Ideal candidates will have 8+ years in data architecture and deep expertise in GCP and Lakehouse architectures. This is a contract-to-hire role emphasizing strategic data initiatives in a healthcare context while fostering collaboration across teams.

Qualifications

  • 8+ years of experience in data architecture, engineering, and data management.
  • 5+ years of GCP experience, including BigQuery, Cloud Storage, and Pub/Sub.
  • Proven experience designing Lakehouse architectures using Delta Lake, Iceberg, or Hudi.

Responsibilities

  • Design and implement scalable data architecture solutions using GCP technologies.
  • Define data architecture best practices and enforce governance.
  • Collaborate with stakeholders for practical data solutions.

Skills

Data architecture
Google Cloud Platform (GCP)
Lakehouse architectures
Data engineering
Python
SQL
MLOps
Problem-solving

Tools

BigQuery
Apache Spark
Airflow
Looker
Power BI
Tableau

Job description

Position Title: Senior Data Architect (GCP – Lakehouse, AI/ML)

Employment Type: Contract-to-hire (the client will only consider visa-independent candidates, e.g., Green Card holders and citizens)

Start Date: Immediate

About the Role

We are seeking an experienced and highly skilled Data Architect to join our dynamic team and lead the development of next-generation cloud-based data platforms. This role is ideal for a strategic, hands-on technical leader with deep expertise in Google Cloud Platform (GCP), Lakehouse architectures, and data engineering. You will help shape the future of data strategy in a leading healthcare organization focused on data-driven decision-making, operational efficiency, and better patient outcomes.

Key Responsibilities

Architecture & Technical Leadership

  • Design and implement scalable, high-performance, cost-effective data architecture solutions using GCP technologies: BigQuery, Dataflow, Dataproc, Cloud Spanner, Pub/Sub, GCS, Vertex AI.
  • Architect and manage data lakes/warehouses, with strong emphasis on Lakehouse principles and technologies: Delta Lake, Apache Iceberg, Hudi.
  • Lead the development of data ingestion and transformation (ETL/ELT) pipelines across structured and unstructured data sources (a minimal pipeline of this kind is sketched below).
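
A minimal sketch of the kind of GCP Lakehouse ingestion pipeline these responsibilities describe, assuming PySpark with Delta Lake (the delta-spark package) and a GCS connector available on the cluster. The bucket paths, column names, and bronze/silver layering are hypothetical placeholders, not details from this posting:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("lakehouse-ingest")
    # Delta Lake extensions provide ACID tables on object storage.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Bronze layer: land raw JSON events from Cloud Storage unchanged.
raw = spark.read.json("gs://example-bucket/raw/events/")  # hypothetical bucket

# Silver layer: light cleansing, deduplication, and date partitioning.
cleaned = (
    raw.withColumn("event_date", F.to_date("event_ts"))  # hypothetical columns
       .dropDuplicates(["event_id"])
)

(cleaned.write
    .format("delta")
    .mode("append")
    .partitionBy("event_date")  # date partitioning supports efficient pruning
    .save("gs://example-bucket/silver/events/"))
```

The same structure carries over to Apache Iceberg or Hudi by swapping the table format and catalog configuration.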

Governance, Standards, and Strategy

  • Define and enforce data architecture best practices, including data governance, security, retention, and compliance.
  • Develop documentation and artifacts to illustrate the data lifecycle, from ingestion through consumption.
  • Provide thought leadership and contribute to enterprise-wide data strategy initiatives.
  • Guide and mentor data engineers and junior architects.

Collaboration & Stakeholder Engagement

  • Work with business stakeholders to translate strategic goals into practical data solutions.
  • Collaborate cross-functionally with software engineers, DevOps, product teams, and analysts to ensure data systems meet end-user needs.
  • Maintain strong communication with data governance, compliance, and security teams.

Required Skills & Experience

  • 8+ years of experience in data architecture, engineering, and data management.
  • 5+ years of GCP experience, including BigQuery, Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Composer.
  • Proven experience designing Lakehouse architectures using Delta Lake, Iceberg, or Hudi.
  • Strong knowledge of schema evolution, data partitioning, indexing, ACID compliance, and distributed file systems.
  • Proficiency in Python and SQL, plus familiarity with Apache Spark, Airflow, and CI/CD pipelines (see the orchestration sketch after this list).
  • Deep understanding of MLOps, real-time data processing, and integrating AI/ML into data workflows.
  • Strong analytical and problem-solving skills with a business mindset.
  • Familiar with BI/AI tools and their integration with modern data platforms (e.g., Looker, Power BI, Tableau, Vertex AI).
  • Hands-on experience with data modeling, metadata management, and data quality frameworks.
  • Experience in Agile/Scrum environments.
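
As a concrete illustration of the Airflow and BigQuery experience asked for above, here is a minimal orchestration sketch using the Google provider's GCSToBigQueryOperator; the DAG id, bucket, object paths, and destination table are hypothetical placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_events_load",      # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ spelling of schedule_interval
    catchup=False,
) as dag:
    # Load the day's curated files from Cloud Storage into BigQuery.
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-bucket",                      # hypothetical bucket
        source_objects=["silver/events/*.parquet"],   # hypothetical path
        source_format="PARQUET",
        destination_project_dataset_table="example-project.analytics.events",
        write_disposition="WRITE_APPEND",
    )
```

In practice a DAG like this would sit downstream of the Spark ingestion job and upstream of BI refreshes in Looker, Power BI, or Tableau.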

Preferred Qualifications

  • Experience in healthcare or regulated data environments.
  • Exposure to FHIR, HL7, or other healthcare data standards.
  • Experience with Apache Beam, Kafka, or other streaming platforms.
  • Familiarity with React, Dash, or other front-end tools for visualizing data pipelines.

Core Competencies

  • Excellent communication and interpersonal skills.
  • Strategic thinking and technology foresight.
  • Strong project management and multitasking capabilities.
  • Ability to work independently and drive outcomes across teams.

Seniority level: Mid-Senior level

Employment type: Contract

Job function: Consulting

Industries: IT Services and IT Consulting
