
A leading grant foundation in Toronto is seeking a candidate to manage and optimize business intelligence systems in Microsoft Azure. This role requires expertise in distributed systems, data governance, and experience with SQL and Python. You will collaborate with various teams to enhance data solutions that align with organizational goals. Competitive package offered.
Recognized as one of Canada’s Most Admired Corporate Cultures and a leading grant foundation, the Ontario Trillium Foundation (OTF) is an agency of the Government of Ontario with a mandate to build healthy and vibrant communities. With a budget of over $100 million, OTF awards grants to some 700 projects every year to build healthy and vibrant communities across Ontario.
This business-critical role is part of the Measurement, Evaluation, and Business Intelligence (MEBI) team and works in collaboration with business units to ensure OTF’s BI systems are optimized to support the enterprise BI and analytics strategy. The main purpose of this role is to manage and optimize the infrastructure of OTF’s business intelligence system in Microsoft Azure. Data engineering, subject matter expertise in distributed systems, and infrastructure management are the core functions of this role.
This role leads the design, implementation, and management of enterprise data solutions on the Microsoft Azure platform. You will work with the MEBI team to architect data infrastructure that impacts key business decisions at the Foundation. Solid experience with, and understanding of, the considerations for operationalizing data warehouses, data lakes, and business intelligence platforms in a cloud environment is a must. We are looking for candidates who have direct experience developing and managing distributed database solutions in Microsoft Azure and the ability to work collaboratively with business stakeholders.
Manage and deploy OTF’s enterprise data infrastructure for analytics and BI systems
Develop, test, and maintain OTF’s data warehouse architecture in Microsoft Azure – experience with Databricks is preferred
Manage a trusted and reliable set of data sources for analytics applications and BI dashboards
Oversee ETL development, including the design, implementation, and scaling of new data pipelines and the transformation of raw data into data models using Databricks or other tools
Optimize existing infrastructure and pipelines, including troubleshooting and solving production problems
Oversee CI/CD processes and source control management using Azure DevOps
Identify ways to improve data quality in collaboration with MEBI team members
Inform data governance strategy, providing feedback on data issues and vulnerabilities, and acting as an advocate and driver of OTF data policies and principles
Ensure data security, privacy, and compliance with relevant regulations and organizational policies (e.g. FIPPA, Privacy Policy, Information Management Policy, Information Classification Standard, Records Retention Schedules, etc.)
Act as the primary liaison between IT, business units, and the MEBI team to ensure alignment of data solutions with organizational goals
Lead the operationalization of BI strategy including data storage, management, and access, in collaboration with Measurement, Evaluation and Business Intelligence department team members
Support OTF’s annual BI plan, providing general guidance to the Director, MEBI, and carry out approved BI initiatives and projects
Lead the development and implementation of BI solutions from a people, process and technology perspective, including an integrated strategy for upskilling team members, operational process optimization, and systems enhancements
In collaboration with other MEBI team members and the Director, participate in the development and execution of analytics roadmap strategies, business cases, RFP/RFI submissions, cost-benefit analysis and benefits realization deliverables.
Inform the BI budget, with general guidance to the Director, MEBI, and in collaboration with the Director, IT.
Strong background in distributed data systems and lakehouse architectures as they relate to BI concepts, methods, tools and technologies
Experience with Microsoft’s Azure environment (ADLS, Logic Apps, Function Apps, Azure Data Factory) – Azure Databricks experience with PySpark and SQL is preferred
Experience programming in SQL, Python, and/or other languages (e.g., R, Spark, PowerShell, Bash) to support data modeling, cleaning, integration, and cloud automation.
Experience designing and managing enterprise-wide cloud data and analytics infrastructure, including lakehouse environments, data modeling, ETL pipelines, and analytic tools
Experience in designing data systems from initial architecture to implementation and optimization.
Demonstrated experience designing, implementing, and maintaining CI/CD pipelines and Git repositories
Project management skills, including the ability to effectively manage and prioritize the implementation of multiple data systems projects with competing timelines and deliverables.
Knowledge of data governance tools and workstreams, including data catalogues, metadata, data quality, and related areas.
Ability to translate business needs into BI solutions for diverse stakeholders
Ability to stay up to date on new and existing programs and software tools that may be relevant to OTF’s BI needs
Willingness and ability to travel occasionally, primarily within Ontario.
Bilingualism is an asset.
A team player with a sense of humour.
Pursuit of Excellence/Accountability: Take ownership and deliver
Agility: Embrace change and ambiguity
Collaboration: Working together with others to leverage skills, talents, and knowledge
Critical Thinking: Interpret, evaluate, and analyze facts and information
Inclusion: Embrace all people irrespective of race, gender identity, or disability
Minimum 1 year direct experience in Enterprise Data Warehouse technologies and distributed systems
Minimum 1 year hands-on experience developing data warehouse, data lake, batch processing, and extract-transform-load (ETL) workflow solutions
Minimum 1 year direct experience designing and optimizing SQL database architecture
Expertise in developing software code in one or more languages (SQL, Python, Java, Spark).
Experience and demonstrated success in project management.
Familiarity with Power BI and SharePoint
Experience with Databricks (Delta Lake, Unity Catalog, Lakehouse Architecture, Databricks SQL Warehouses, Databricks Asset Bundles, etc.)
Experience with Azure DevOps or other Git version control systems
Experience working with third-party APIs to ingest data for analytics purposes.
Bachelor’s degree in computer science or a related field, or equivalent work experience
We invite applications from people who reflect the diverse communities we serve, including Indigenous (First Nation, Métis and Inuit) and Black peoples.
AI is not used to screen, assess, or select applicants
To apply for this position, submit your cover letter and resume online.