Senior Data Architect

Teckhorizon Inc

Edmonton

Hybrid

CAD 100,000 - 130,000

Full time

Today

Job summary

A technology consulting firm is seeking a Data Architect to design, implement and manage data solutions for the Government of Alberta. This role involves using Azure technologies to ensure seamless data integration and governance. Candidates should have strong experience in Databricks, Azure Data Factory, and strategic data architecture. The position is primarily remote in Canada with occasional on-site meetings in Edmonton.

Qualifications

  • Strong experience with Databricks Platform Administration and Optimization.
  • 8+ years in Enterprise-Wide Data Architecture and Strategic Alignment.
  • Experience with Azure Infrastructure Services and Authentication is essential.

Responsibilities

  • Design and implement data architectures in Microsoft Azure.
  • Lead the development of data integration pipelines using Azure tools.
  • Implement data governance and compliance measures.

Skills

Databricks Platform Administration & Optimization
Azure Infrastructure Services & Authentication
Python and SQL for Data Engineering
Enterprise-Wide Data Architecture & Strategic Alignment
Designing Analytics-Ready Data Platforms

Education

College diploma or bachelor's degree in Computer Science or a related field

Tools

Azure Data Factory
Azure Synapse Analytics
Databricks

Job description

Overview

Project: Data Management Platform Projects

Client: Government of Alberta

The Government of Alberta is modernizing its legacy systems by migrating to a cloud-native Azure Data Management Platform complemented by on-premises geospatial systems.

We are seeking a Data Architect to design, implement, and manage scalable, secure, and integrated data solutions supporting ministries such as Environment and Protected Areas; Transportation and Economic Corridors; and Service Alberta.

The role focuses on enabling seamless data ingestion, transformation, and integration using Azure Data Factory, Synapse Analytics, Data Lake Storage, Databricks, and Purview, while ensuring strong governance, compliance, and scalability.

Key Responsibilities
  • Design and implement scalable, secure, high-performance data architectures in Microsoft Azure (cloud-native and hybrid environments).
  • Lead the development of data ingestion, transformation, and integration pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
  • Architect and manage data lakes and structured storage (Azure Data Lake Storage Gen2) with governance and efficient access.
  • Integrate data from diverse systems (e.g., ServiceNow, ERP, and geospatial tools) using APIs, connectors, and custom scripts.
  • Develop and maintain data models and semantic layers to support analytics, reporting, and machine learning.
  • Build and optimize data workflows in Python and SQL for data cleansing, enrichment, and analytics within Azure Databricks.
  • Design and expose secure APIs and data services using Azure API Management for downstream systems.
  • Implement and oversee data governance: metadata management, classification, and lineage tracking.
  • Ensure compliance with FOIP, GDPR, and other privacy regulations through access control, encryption, and data masking.
  • Collaborate with cross-functional teams to align architecture with business goals and modernization strategies.
  • Monitor and troubleshoot data pipelines to maintain performance, scalability, and reliability.
  • Provide technical leadership and mentorship to data engineers and analysts.
  • Perform other related duties as required.
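To illustrate the cleanse-and-enrich workflows described above: in practice these pipelines would run in Azure Databricks on Spark and Delta Lake, but the same SQL pattern can be sketched with Python's standard-library sqlite3. This is a minimal sketch only; the table and column names below are hypothetical, not taken from the project.

```python
import sqlite3

# Hypothetical raw landing table; real pipelines would read from
# Azure Data Lake Storage into Databricks rather than sqlite3.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_permits (permit_id TEXT, ministry TEXT, fee REAL)")
conn.executemany(
    "INSERT INTO raw_permits VALUES (?, ?, ?)",
    [
        ("P-001", " Transportation ", 120.0),
        ("P-002", None, 80.0),               # missing ministry: dropped by cleanse
        ("P-003", "Service Alberta", None),  # missing fee: dropped by cleanse
        ("P-004", "Environment", 45.5),
    ],
)

# Cleanse: drop incomplete rows and trim stray whitespace.
# Enrich: derive a fee band for downstream analytics and reporting.
conn.execute("""
    CREATE TABLE curated_permits AS
    SELECT permit_id,
           TRIM(ministry) AS ministry,
           fee,
           CASE WHEN fee >= 100 THEN 'high' ELSE 'standard' END AS fee_band
    FROM raw_permits
    WHERE ministry IS NOT NULL AND fee IS NOT NULL
""")

rows = conn.execute(
    "SELECT permit_id, ministry, fee_band FROM curated_permits ORDER BY permit_id"
).fetchall()
print(rows)
# [('P-001', 'Transportation', 'high'), ('P-004', 'Environment', 'standard')]
```

The same cleanse-then-enrich shape carries over directly to a Databricks notebook, where the SQL would target Delta tables instead of an in-memory database.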
Education & Qualifications

Requirements

  • Education: College diploma or bachelor's degree in Computer Science or a related field.
Experience & Skills

Must-Have Experience

  • Databricks Platform Administration & Optimization — 3 years
  • Enterprise-Wide Data Architecture & Strategic Alignment — 8 years
  • Designing Analytics-Ready Data Platforms — 4 years
  • Version Control Systems — 4 years
  • Azure Infrastructure Services & Authentication — 5 years
  • Python and SQL for Data Engineering — 6 years
  • Azure Databricks & Delta Lake — 3 years

Nice-to-Have Experience / Certifications

  • TOGAF Certification — 1 year
  • AI-driven Code Generation & Automation — 1 year
  • Business Requirement Analysis (Data Context) — 8 years
  • Microsoft SQL Server (Advanced) — 8 years
  • ETL Pipeline Development — 2 years
  • Data Governance, Security & Metadata in Databricks — 2 years
  • RESTful API Design & Integration — 3 years
  • Message Queueing (Azure Service Bus, etc.) — 3 years
  • Cross-Functional Team Collaboration — 5 years
  • ServiceNow Azure Data Management — 1 year
Work Details

Location: Primarily remote (within Canada).

On-site meetings (up to 3-4 times per month) in Edmonton, Alberta, as required.

Submission Requirements
  • Resume (must include relevant experience under each job / project, with dates in MMM/YYYY to MMM/YYYY format).
  • Three professional references (most recent first).
  • Reference checks may be used for scoring.
Additional Details
  • Employment Type: Full Time
  • Vacancy: 1