Overview
Project: Data Management Platform Projects
Client: Government of Alberta
The Government of Alberta is modernizing its legacy systems by migrating to a cloud-native Azure Data Management Platform complemented by on-premises geospatial systems.
We are seeking a Data Architect to design, implement, and manage scalable, secure, and integrated data solutions supporting ministries such as Environment and Protected Areas; Transportation and Economic Corridors; and Service Alberta.
The role focuses on enabling seamless data ingestion, transformation, and integration using Azure Data Factory, Synapse Analytics, Data Lake Storage, Databricks, and Purview, while ensuring strong governance, compliance, and scalability.
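As a concrete illustration of the ingestion side of this work, here is a minimal Python sketch that lands a source extract in Azure Data Lake Storage Gen2 using Azure AD authentication. It is a sketch only: the storage account, container, and file paths are hypothetical placeholders, and it assumes the azure-identity and azure-storage-file-datalake packages are installed.

```python
# Minimal ingestion sketch: land a source extract in ADLS Gen2.
# The account, container, and paths below are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Authenticate via Azure AD (managed identity, Azure CLI login, etc.).
service = DataLakeServiceClient(
    account_url="https://examplelake.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)

filesystem = service.get_file_system_client("raw")               # landing container
target = filesystem.get_file_client("permits/2024/permits.csv")  # hypothetical path

# Upload the extract, replacing any earlier version of the file.
with open("permits.csv", "rb") as data:
    target.upload_data(data, overwrite=True)
```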
Key Responsibilities
- Design and implement scalable, secure, high-performance data architectures in Microsoft Azure (cloud-native and hybrid environments).
- Lead the development of data ingestion, transformation, and integration pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
- Architect and manage data lakes and structured storage (Azure Data Lake Storage Gen2) with governance and efficient access.
- Integrate data from diverse systems (e.g., ServiceNow, ERP, and geospatial tools) using APIs, connectors, and custom scripts.
- Develop and maintain data models and semantic layers to support analytics, reporting, and machine learning.
- Build and optimize data workflows in Python and SQL for data cleansing, enrichment, and analytics within Azure Databricks (see the sketch after this list).
- Design and expose secure APIs and data services using Azure API Management for downstream systems.
- Implement and oversee data governance: metadata management, classification, and lineage tracking.
- Ensure compliance with FOIP, GDPR, and other privacy regulations through access control, encryption, and data masking.
- Collaborate with cross-functional teams to align architecture with business goals and modernization strategies.
- Monitor and troubleshoot data pipelines to maintain performance, scalability, and reliability.
- Provide technical leadership and mentorship to data engineers and analysts.
- Perform other related duties as required.
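As referenced above, the following is a minimal sketch of the kind of cleansing and enrichment workflow a Databricks notebook might run. The dataset, column names, and storage paths are hypothetical; it assumes source data has already landed in ADLS Gen2 as a Delta table.

```python
# Minimal Databricks cleansing/enrichment sketch (hypothetical dataset).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided by the Databricks runtime

# Read the raw Delta table (path is a placeholder).
raw = spark.read.format("delta").load(
    "abfss://raw@examplelake.dfs.core.windows.net/permits"
)

cleansed = (
    raw.dropDuplicates(["permit_id"])                    # remove duplicate records
       .filter(F.col("permit_id").isNotNull())           # drop rows missing the key
       .withColumn("issued_on", F.to_date("issued_on"))  # normalize date strings
       .withColumn("decided_on", F.to_date("decided_on"))
       # Enrichment: a derived metric for downstream analytics.
       .withColumn("processing_days", F.datediff("decided_on", "issued_on"))
)

# Persist the curated table for reporting and ML workloads.
cleansed.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplelake.dfs.core.windows.net/permits"
)
```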
Education & Qualifications
Requirements
- Education: College diploma or bachelor's degree in Computer Science or a related field.
Experience & Skills
Must-Have Experience
- Databricks Platform Administration & Optimization — 3 years
- Enterprise-Wide Data Architecture & Strategic Alignment — 8 years
- Designing Analytics-Ready Data Platforms — 4 years
- Version Control Systems — 4 years
- Azure Infrastructure Services & Authentication — 5 years
- Python and SQL for Data Engineering — 6 years
- Azure Databricks & Delta Lake — 3 years
Nice-to-Have Experience / Certifications
- TOGAF Certification — 1 year
- AI-driven Code Generation & Automation — 1 year
- Business Requirement Analysis (Data Context) — 8 years
- Microsoft SQL Server (Advanced) — 8 years
- ETL Pipeline Development — 2 years
- Data Governance, Security & Metadata in Databricks — 2 years
- RESTful API Design & Integration — 3 years
- Message Queueing (Azure Service Bus, etc.) — 3 years
- Cross-Functional Team Collaboration — 5 years
- ServiceNow / Azure Data Management — 1 year
Work Details
Location: Primarily remote (within Canada).
On-site meetings (up to 3-4 times per month) in Edmonton, Alberta, as required.
Submission Requirements
- Resume (must list relevant experience under each job / project in MMM/YYYY to MMM/YYYY format).
- Three professional references (most recent first).
- Reference checks may be used for scoring.
Additional Details
- Employment Type: Full Time
- Vacancy: 1