
Senior Data Engineer with experience working within the investment industry for ETL data integration and development for an Investment Data Solution

S I Systems

Toronto, Ottawa, Calgary, Vancouver, Edmonton

Remote

CAD 80,000 - 100,000

Full time


Job summary

A leading technology solutions provider is seeking a Senior Data Engineer to work on ETL data integration for investment data solutions. The ideal candidate has extensive experience in data engineering, is proficient in Python and SQL, and is familiar with Azure data tools. Responsibilities include developing scalable data models and collaborating with analysts to ensure data accuracy and integrity. This is a contract role with a remote work option within Canada.

Qualifications

  • 3+ years of experience in data engineering.
  • 7+ years developing ETL processes.
  • 3+ years designing and optimizing database structures for investment data.
  • 5+ years of hands-on experience with Python and SQL.
  • 3+ years of experience with Azure data tools.

Responsibilities

  • Collaborate with client teams to gather data requirements.
  • Develop and optimize ETL processes for data sourcing.
  • Ensure data accuracy and integrity through analysis and validation.
  • Design database structures for investment data.
  • Implement big data technologies for data management.
  • Utilize Azure technologies for building data pipelines.

Skills

Data engineering
ETL processes
Python
SQL
Azure Data Lake
JIRA
Azure DevOps

Tools

Azure Data Factory
Databricks

Job description

Senior Data Engineer with experience working within the investment industry for ETL data integration and development for an Investment Data Solution

Job Type: Contract

Positions to fill: 1

Start Date: Aug 15, 2025

Job End Date: Jan 13, 2026

Pay Rate (Hourly): Negotiable

Job ID: 147024


Location: Remote within Canada

Position Overview: We are seeking a skilled Data Engineer to collaborate with teams of analysts and system architects to deliver robust, scalable data solutions for investment data. The ideal candidate will have a strong background in data engineering, ETL development, database architecture, and a deep understanding of both relational and non-relational databases. Additionally, they will work on building data pipelines, ensuring data accuracy, and supporting analytics tasks across multiple investment-related platforms.

Must-Haves:

  • 3+ years of experience in data engineering
  • 7+ years of experience developing ETL processes
  • 3+ years of experience designing and optimizing database structures for investment data, with knowledge of big data technologies such as data lakes and warehouses.
  • 5+ years of hands-on experience with Python and SQL programming languages.
  • 3+ years of experience with Azure data tools such as Azure Data Lake, Azure Data Factory, and Databricks (DBT is a plus).
  • 2+ years of experience working in an Agile environment, using DevOps tools like JIRA and Azure DevOps (ADO).

Nice-to-Haves:

  • Industry experience with investment companies
  • Experience collaborating with data analysts and investment professionals to develop data pipelines and workflows supporting portfolio analysis, risk assessment, and performance attribution.
  • Familiarity with investment analytics concepts such as portfolio analysis, risk assessment, and performance attribution.
  • Experience with market data feeds, trading platforms, and financial databases.

Responsibilities:

  • Collaborate with client teams, data analysts, and system architects to gather data requirements and design scalable data models, pipelines, and integration processes for investment-related data.
  • Develop and optimize ETL processes to gather and transform data from various sources such as market data providers, trading platforms, and internal systems.
  • Perform data analysis, validation, and cleansing to ensure data accuracy and integrity.
  • Design and optimize database structures tailored for investment data storage and retrieval, including both relational and dimensional data modeling.
  • Implement modern big data technologies, including data lakes and data warehouses, to enhance data storage and management.
  • Work with Python and SQL to develop data workflows and processes.
  • Utilize Azure technologies such as Azure Data Lake, Azure Data Factory, and Databricks to build and maintain data pipelines.
  • Participate in Agile software development using tools such as JIRA and Azure DevOps.
  • Collaborate with investment professionals to support analytics models and risk assessments for portfolio analysis.