Data Architect / Modeller - Senior

Lancesoft

Toronto

On-site

CAD 90,000 - 120,000

Full time

Today

Job summary

A data solutions company in Toronto seeks a skilled Data Architect to develop and implement data architecture. The ideal candidate will have proficiency in SQL and Python, hands-on experience with Databricks and Spark SQL, and will be responsible for data modeling and quality assurance. This position offers opportunities for growth in a collaborative and agile environment.

Qualifications

  • Experience in design, development and implementation of data models for analytics.
  • Knowledgeable in BI modelling methodologies such as Inmon and Kimball.
  • Strong understanding of automated data quality checks using Python and SQL.
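
Automated data quality checks of the kind listed above can be sketched with plain Python and SQL. This is a minimal illustration only; the table name, columns, and rules are hypothetical, not taken from the posting (which uses Databricks and Spark SQL, swapped here for SQLite so the sketch is self-contained).

```python
import sqlite3

# Hypothetical staging table -- names are illustrative, not from the posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10, 99.5), (2, NULL, 20.0), (2, 11, -5.0);
""")

# Each rule is a SQL query returning the number of violating rows.
rules = {
    "customer_id not null": "SELECT COUNT(*) FROM stg_orders WHERE customer_id IS NULL",
    "amount non-negative":  "SELECT COUNT(*) FROM stg_orders WHERE amount < 0",
    "order_id unique":      """SELECT COUNT(*) FROM (
                                   SELECT order_id FROM stg_orders
                                   GROUP BY order_id HAVING COUNT(*) > 1)""",
}

# Run every rule and collect violation counts; a nonzero count fails the check.
failures = {name: conn.execute(sql).fetchone()[0] for name, sql in rules.items()}
print(failures)
```

In a real pipeline the same rule-as-query pattern would run against Spark SQL tables and gate promotion of data between layers.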

Responsibilities

  • Develop and implement data architecture for application development.
  • Create detailed data standards to enable integration with systems.
  • Monitor milestones and submission of status reports.

Skills

Proficiency in SQL
Python
Data modeling
Data transformation using Databricks
Data quality principles
Agile methodologies

Tools

Databricks
Spark SQL
ERwin
Visio
PowerDesigner
Git

Job description

Overview

Location: Toronto, ON

Description

Responsibilities

  • Develops and implements the data architecture for application development in a complex and distributed environment, including the determination of the flow and distribution of data, the location of databases, and data access methods.
  • Provides a standard common business vocabulary, expresses strategic data requirements, outlines high level integrated designs to meet these requirements, and aligns with the enterprise strategy and related business architecture.
  • Define conceptual, logical, and physical models, including mapping from data source to curated model and data mart.
  • Design dimensional data mart models, create source-to-target mapping documentation, and design and document data transformations from the curated model to the data mart.
  • The Architect / Modeller must have previous work experience conducting knowledge transfer and training sessions, ensuring that resources receive the knowledge required to support the system. The resource must develop learning activities using a review-watch-do methodology and demonstrate the ability to prepare and present.
  • Develop documentation and materials as part of review and knowledge transfer to other team members.
  • Monitor identified milestones and submission of status reports to ensure Knowledge Transfer is fully completed.
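
The dimensional-mart and source-to-target work described above can be sketched as follows. All table names and data are invented for illustration, and SQLite stands in for the Databricks/Spark SQL environment the posting names, so the sketch runs anywhere.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hypothetical curated (source) layer.
    CREATE TABLE curated_sales (sale_date TEXT, product TEXT, amount REAL);
    INSERT INTO curated_sales VALUES
        ('2024-01-05', 'Widget', 100.0),
        ('2024-01-05', 'Gadget',  40.0),
        ('2024-02-01', 'Widget',  60.0);

    -- Kimball-style star schema in the data mart layer.
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product TEXT UNIQUE);
    CREATE TABLE fact_sales  (product_key INTEGER, sale_date TEXT, amount REAL);

    -- Source-to-target transformation: populate the dimension, then the fact.
    INSERT INTO dim_product (product) SELECT DISTINCT product FROM curated_sales;
    INSERT INTO fact_sales
        SELECT d.product_key, c.sale_date, c.amount
        FROM curated_sales c JOIN dim_product d USING (product);
""")

# Analytics query against the mart: total Widget revenue across all dates.
total = conn.execute(
    "SELECT SUM(amount) FROM fact_sales f JOIN dim_product d USING (product_key) "
    "WHERE d.product = 'Widget'").fetchone()[0]
print(total)
```

The SQL here doubles as source-to-target mapping documentation: each INSERT...SELECT records exactly how a mart table derives from the curated layer.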

General Skills

Technical Experience (30%)

  • Proficiency in SQL and Python, with hands-on experience using Databricks and Spark SQL for data modeling and transformation tasks.
  • Experience with at least two different platforms, operating systems, environments, database technologies, languages and communications protocols.
  • Knowledge of performance considerations for different database designs in different environments.
  • Knowledge and experience in information resource management tools and techniques.

Data Architecture & Modeling (50%)

  • Experience in design, development and implementation of data models for analytics and business intelligence.
  • Knowledgeable in BI modelling methodologies (Inmon, Kimball, data vault), data mapping, data warehouse, data lake and data lakehouse for enterprise.
  • Strong understanding of data quality principles, with the ability to design and implement automated data quality checks using tools such as Python and SQL, ensuring data integrity across pipelines and models.
  • Experience in structured methodologies for the design, development and implementation of applications.
  • Experience in systems analysis and design in large or medium systems environments.
  • Experience in the use of data modelling methods and tools (e.g. ERwin, Visio, PowerDesigner) including a working knowledge of metadata structures, repository functions, and data dictionaries.
  • Experience in monitoring and enforcing data modelling / normalization standards.
  • Experience in developing enterprise architecture deliverables (e.g. Models).
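
Monitoring and enforcing modelling standards, as the bullets above describe, can be partially automated by querying the database catalog. This sketch checks a hypothetical naming convention (dim_/fact_/stg_ prefixes, lowercase names) that is an assumption for illustration, not a standard stated in the posting.

```python
import re
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY);
    CREATE TABLE fact_orders  (order_key INTEGER PRIMARY KEY);
    CREATE TABLE Orders_Temp  (id INTEGER);  -- violates the convention
""")

# Hypothetical standard: warehouse tables start with dim_, fact_, or stg_
# and use lowercase snake_case.
STANDARD = re.compile(r"^(dim|fact|stg)_[a-z_]+$")

# Read table names from the catalog and flag anything non-conforming.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
violations = sorted(t for t in tables if not STANDARD.match(t))
print(violations)
```

The same catalog-scan pattern extends to column naming, missing primary keys, or normalization rules, and can run as a CI step alongside the modelling tools (ERwin, PowerDesigner) the posting lists.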

Agile Product Development (20%)

  • Experience working in an agile, sprint-based development environment.
  • Understanding and working knowledge of iterative product development cycles (Discovery, Agile, Beta, Live).
  • Experience collaborating and sharing tasks with multiple developers on complex data product deliveries.
  • Experience contributing to version-controlled, shared codebases using Git (Azure DevOps, GitHub, Bitbucket) and participating in pull request code reviews.

Desirable Skills

  • Experience with middleware and gateways.
  • Experience in designing / developing an automated data distribution mechanism.
  • Knowledge and understanding of object-oriented analysis and design techniques.
  • Experience in developing enterprise architecture deliverables (e.g. Models) based on Ontario Government Enterprise Architecture processes and practice.
  • Knowledge and understanding of Information Management principles, concepts, policies and practices.
  • Experience creating detailed data standards to enable integration with other systems.
  • Experience reviewing conceptual, logical and physical data models for quality and adherence to standards.
  • Knowledge and understanding of dimensional and relational data models.
  • Knowledge and experience in information resource management tools and techniques.

Must Have

  • Proficiency in SQL and Python, with hands-on experience using Databricks and Spark SQL for data modeling and transformation tasks.
  • Experience with at least two different platforms, operating systems, environments, database technologies, languages and communications protocols.
  • Experience in design, development and implementation of data models for analytics and business intelligence.
  • Knowledgeable in BI modelling methodologies (Inmon, Kimball, data vault), data mapping, data warehouse, data lake and data lakehouse for enterprise.
  • Strong understanding of data quality principles, with the ability to design and implement automated data quality checks using tools such as Python and SQL, ensuring data integrity across pipelines and models.
  • Experience with middleware and gateways.
  • Experience in designing / developing an automated data distribution mechanism.