Senior Data Engineer

Systematix Group

Ottawa

On-site

CAD 80,000 - 110,000

Full time

Job summary

A technology solutions provider in Ottawa is seeking a Senior Data Engineer to lead project delivery in data and analytics. Responsibilities include designing and building data pipelines, collaborating with stakeholders, and ensuring high-quality data solutions. The ideal candidate has extensive experience in cloud-native technologies and automated delivery processes, with a strong focus on Azure tools and programming languages like Python and Scala.

Qualifications

  • Experience leading, designing, and implementing data solutions.
  • Expertise in building robust data pipelines for data consumption.

Responsibilities

  • Lead delivery of data on the analytics platform.
  • Build and implement batch and real-time data pipelines.
  • Collaborate with stakeholders to refine data requirements.
  • Increase quality and speed of data onboarding.

Skills

Programming experience in Spark using modern languages such as Python
Experience with Azure Data Lake Storage
Involvement in Data Engineering principles
Integration patterns such as Azure Event Hub
Database modeling techniques
Streaming data architecture experience
SQL Server and Oracle experience
Source code management environments

Tools

Azure Databricks
Azure Synapse
Cassandra
MongoDB
Azure DevOps

Job description

We are Systematix and we are looking for a Senior Data Engineer for an upcoming opportunity with a public sector client. This contract is expected to start in October and last until September 2026. The ideal candidate must possess a Reliability level security clearance.

About the Project

The Senior Data Engineer Service Provider will be responsible for leading the delivery of data on the data and analytics platform. These services are at the forefront of building out a data engineering practice using cloud-native technologies. The Senior Data Engineer Service Provider must have experience in leading, designing, implementing, and collaborating with stakeholders to achieve the best results for our clients.

About the Responsibilities
  • Design, build, and implement batch and real-time data pipelines, driven by automated, repeatable delivery of data that aligns with enterprise data governance standards.
  • Develop and propose design patterns that conform to requirements, ensuring the proposed design optimally addresses access and query patterns and data consumption, and adheres to internal architecture standards.
  • Collaborate with stakeholders across the business, data science, and IT, building relationships and refining data requirements to meet data and analytics initiatives and data consumption needs.
  • Increase the overall speed at which data is onboarded to the Data and Analytics platform.
  • Build robust data pipelines to enable larger-scale data consumption on the Data and Analytics platform.
  • Increase the overall quality of data pipeline development through DevSecOps.
About the Qualifications
  • Programming experience in Spark using modern languages such as Python, Scala
  • Experience working with modern data architectures like Azure Data Lake Storage, Azure Databricks, Azure Synapse (formerly SQL Data Warehouse) and Delta Lakes
  • Experience leading Data Engineering principles within an organization/team.
  • Experience working with Integration patterns and technologies such as Azure Event Hub, Function App and C#
  • Knowledge and expertise of database modeling techniques: Data Vaults, Star, Snowflake, 3NF, etc.
  • Experience working with streaming data architecture and technologies for real-time processing: Spark Streaming, Kafka, Flink, Storm.
  • Experience working with relational and non-relational database technologies: SQL Server, Oracle, Cassandra, MongoDB, CosmosDB, HBase.
  • Experience working with source code and configuration management environments such as Azure DevOps, Git, Maven, Nexus.

Assets:

  • Experience within Azure environment
  • Strong Python, Scala and Spark experience
  • Experience modernizing data platforms.
  • Experience with Azure Functions and C#.
  • 2 to 3 projects developing Data Vault.

Candidates must outline in detail how they meet the above requirements.

At Systematix, our core values—excellence, collaboration, respect, and knowledge as a pursuit—underpin our commitment to fostering an inclusive and equitable environment. We encourage everyone to be their authentic selves, and we are committed to ensuring that our employment decisions are entirely based on job requirements and individual qualifications. We welcome applications from qualified candidates of all backgrounds, including but not limited to race, ethnicity, gender identity or expression, sexual orientation, disability, age, and religious beliefs. If our values and the position advertised resonate with you, we encourage you to apply.
