Data Architect/Modeller - Senior

LanceSoft Inc

Toronto

On-site

CAD 100,000 - 130,000

Full time

Job summary

A leading data solutions firm in Toronto seeks a Data Architect/Modeller to develop data architecture and oversee data models for analytics. The ideal candidate has strong proficiency in SQL and Python, experience with Databricks and Spark SQL, and a solid understanding of BI modelling methodologies. This role offers a chance to work in a dynamic, agile environment.

Job description
Overview

Data Architect/Modeller – Toronto, ON

Responsibilities
  • Develops and implements the data architecture for application development in a complex and distributed environment, including the determination of the flow and distribution of data, the location of databases, and data access methods.
  • Provides a standard common business vocabulary, expresses strategic data requirements, outlines high level integrated designs to meet these requirements, and aligns with the enterprise strategy and related business architecture.
  • Defines conceptual, logical, and physical model mappings from data sources to the curated model and data mart.
  • Designs dimensional data mart models, creates source-to-target mapping documentation, and designs and documents data transformations from the curated model to the data mart (a star-schema sketch follows this list).
  • Must have previous work experience conducting knowledge transfer and training sessions, ensuring that resources receive the knowledge required to support the system. Must develop learning activities using the review-watch-do methodology and demonstrate the ability to prepare and present materials.
  • Develops documentation and materials as part of reviews and knowledge transfer to other team members.
  • Monitors identified milestones and submits status reports to ensure knowledge transfer is fully completed.
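
As a concrete illustration of the dimensional mart design work above, here is a minimal star-schema sketch in Python with Spark SQL, in the spirit of the Databricks stack named in this posting. All schema, table, and column names (mart.dim_date, mart.fct_sales) are hypothetical placeholders, not details from the role.

```python
# Minimal star-schema sketch for a data mart. All names are
# hypothetical placeholders; a real model would follow the
# organization's standards and source-to-target mapping documents.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mart-model-sketch").getOrCreate()

spark.sql("CREATE SCHEMA IF NOT EXISTS mart")

# Conformed date dimension: one row per calendar date.
spark.sql("""
    CREATE TABLE IF NOT EXISTS mart.dim_date (
        date_key      INT COMMENT 'Surrogate key, yyyyMMdd',
        calendar_date DATE,
        fiscal_year   INT,
        month_name    STRING
    )
""")

# Fact table at order-line grain, keyed to the dimensions.
spark.sql("""
    CREATE TABLE IF NOT EXISTS mart.fct_sales (
        date_key     INT,
        product_key  INT,
        customer_key INT,
        quantity     INT,
        net_amount   DECIMAL(18, 2)
    )
""")
```
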
General Skills
Technical Experience (30%)
  • Proficiency in SQL and Python, with hands-on experience using Databricks and Spark SQL for data modelling and transformation tasks (a transformation sketch follows this list).
  • Experience with at least two different platforms, operating systems, environments, database technologies, languages and communications protocols.
  • Knowledge of performance considerations for different database designs in different environments.
  • Knowledge and experience in information resource management tools and techniques.
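
As one illustration of the SQL/Python and Spark SQL experience asked for above, the following is a minimal sketch of a curated-to-mart transformation driven from Python, as it might look on Databricks. The source table curated.orders and its columns are hypothetical.

```python
# Minimal curated-to-mart transformation sketch in Spark SQL, driven
# from Python. curated.orders and its columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("transform-sketch").getOrCreate()

spark.sql("CREATE SCHEMA IF NOT EXISTS mart")

# Derive a yyyyMMdd surrogate date key and aggregate to the
# daily-product grain of the target fact table.
daily_sales = spark.sql("""
    SELECT CAST(date_format(order_ts, 'yyyyMMdd') AS INT) AS date_key,
           product_id                                     AS product_key,
           SUM(quantity)                                  AS quantity,
           SUM(quantity * unit_price)                     AS net_amount
    FROM curated.orders
    GROUP BY date_format(order_ts, 'yyyyMMdd'), product_id
""")

# Persist as a managed table (Delta by default on Databricks).
daily_sales.write.mode("overwrite").saveAsTable("mart.fct_daily_sales")
```
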
Data Architecture & Modeling (50%)
  • Experience in design, development and implementation of data models for analytics and business intelligence.
  • Knowledgeable in BI modelling methodologies (Inmon, Kimball, Data Vault), data mapping, data warehouse, data lake and data lakehouse for enterprise.
  • Strong understanding of data quality principles, with the ability to design and implement automated data quality checks using tools such as Python and SQL, ensuring data integrity across pipelines and models (a check sketch follows this list).
  • Experience in structured methodologies for the design, development and implementation of applications.
  • Experience in systems analysis and design in large or medium systems environments.
  • Experience in the use of data modelling methods and tools (e.g. ERWIN, VISIO, PowerDesigner) including a working knowledge of metadata structures, repository functions, and data dictionaries.
  • Experience in monitoring and enforcing data modelling/normalization standards.
  • Experience in developing enterprise architecture deliverables (e.g. models).
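
As a concrete example of the automated data quality checks mentioned above, here is a minimal sketch using Python and Spark SQL. The mart.fct_sales table and its columns are hypothetical placeholders, and real pipelines might wrap such checks in a dedicated framework.

```python
# Minimal automated data-quality check sketch in Python and Spark SQL.
# mart.fct_sales and its columns are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dq-check-sketch").getOrCreate()

# Each check is a SQL query that counts violating rows; zero means pass.
checks = {
    "null_date_keys":
        "SELECT COUNT(*) FROM mart.fct_sales WHERE date_key IS NULL",
    "negative_quantities":
        "SELECT COUNT(*) FROM mart.fct_sales WHERE quantity < 0",
}

failures = {}
for name, query in checks.items():
    violations = spark.sql(query).first()[0]
    if violations > 0:
        failures[name] = violations

# Fail the pipeline loudly so bad data never reaches the mart.
if failures:
    raise ValueError(f"Data quality checks failed: {failures}")
```
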
Agile Product Development (20%)
  • Experience working in an agile, sprint-based development environment.
  • Understanding and working knowledge of iterative product development cycles (Discovery, Agile, Beta, Live).
  • Experience collaborating and sharing tasks with multiple developers on complex data product deliveries.
  • Experience contributing to version-controlled, shared codebases using git (Azure DevOps, GitHub, Bitbucket) and participating in pull request code reviews.
Desirable Skills
  • Experience with middleware and gateways.
  • Experience in designing/developing an automated data distribution mechanism.
  • Knowledge and understanding of object-oriented analysis and design techniques.
  • Experience in developing enterprise architecture deliverables (e.g. models) based on Ontario Government Enterprise Architecture processes and practice.
  • Knowledge and understanding of Information Management principles, concepts, policies and practices.
  • Experience creating detailed data standards to enable integration with other systems.
  • Experience reviewing conceptual, logical and physical data models for quality and adherence to standards.
  • Knowledge and understanding of dimensional and relational data models.
  • Knowledge and experience in information resource management tools and techniques.
Must Have
  • Proficiency in SQL and Python, with hands-on experience using Databricks and Spark SQL for data modelling and transformation tasks.
  • Experience with at least two different platforms, operating systems, environments, database technologies, languages and communications protocols.
  • Experience in design, development and implementation of data models for analytics and business intelligence.
  • Knowledgeable in BI modelling methodologies (Inmon, Kimball, Data Vault), data mapping, data warehouse, data lake and data lakehouse for enterprise.
  • Strong understanding of data quality principles, with the ability to design and implement automated data quality checks using tools such as Python and SQL, ensuring data integrity across pipelines and models.
  • Experience with middleware and gateways.
  • Experience in designing/developing an automated data distribution mechanism.