Data Engineer

Intrado Life & Safety, Inc.

Canada

On-site

CAD 185,000 - 200,000

Full time

Today


Job summary

A leading emergency services partner in Canada is seeking a Data Engineer to develop robust data pipelines for internal analytics. The successful candidate will have over 5 years of experience in Data Engineering and be proficient in SQL and Python, particularly in cloud environments. Responsibilities include building and maintaining Azure Data Factory pipelines, ensuring data quality, and troubleshooting any issues. The role comes with a starting salary between $185,000 and $200,000 CAD, plus comprehensive benefits and bonus eligibility.

Benefits

Medical, dental, and vision coverage
Paid time off
Tuition reimbursement
Employee discounts

Qualifications

  • 5+ years of experience in Data Engineering focused on ETL/ELT pipelines.
  • Proficiency in building and maintaining data pipelines in a cloud environment.
  • Strong proficiency in SQL for data analysis.

Responsibilities

  • Build and maintain Azure Data Factory pipelines to ingest data.
  • Write Python code in Databricks to clean raw data.
  • Monitor daily jobs and troubleshoot failures.

Skills

Experience in Data Engineering
Building and optimizing data pipelines
SQL proficiency
Python scripting
Data quality assurance
Familiarity with data schemas and APIs
Experience with LLMs

Education

Bachelor’s degree in Computer Science, Software Engineering, or a related field

Tools

Azure Data Factory
Databricks

Job description

About Us

Intrado is dedicated to saving lives and protecting communities, helping them prepare for, respond to, and recover from critical events. Our cutting‑edge company strives to become the most trusted, data‑centric emergency services partner by uniting fragmented communications into actionable intelligence for first responders. At Intrado, all of our work truly matters.

Responsibilities / Qualifications

We are seeking an exceptional Data Engineer to build the robust data pipelines that will power our company’s internal business analytics. Working under the guidance of the Staff Data Engineer, you will ensure that the raw data from multiple systems is consistently ingested, cleaned, and made ready for analysis. By building stable and efficient pipelines, you will directly support the timely generation of visualizations that leadership relies on to make informed decisions.

This is a demanding role in a results‑oriented environment with high expectations for agency, speed, and ownership.

Key Responsibilities

  • Pipeline Execution: Build and maintain Azure Data Factory pipelines to ingest data from multiple sources.
  • Silver Layer Transformation: Write Python code in Databricks to clean raw data and move it into the silver layer, handling deduplication, type casting, and validation (a sketch of this pattern follows this list).
  • Reliability: Monitor daily jobs and troubleshoot failures. You are the first line of defense in ensuring that pipelines are stable and do not break.
  • Data Quality: Implement automated checks to verify that data arriving in the lake matches the source systems.
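
As a rough, illustrative sketch of the silver-layer pattern described above (not Intrado's actual implementation), the following PySpark snippet deduplicates, casts types, applies a basic null check, and writes the result. The table names, columns, and rules are hypothetical, and the spark session object assumes a Databricks notebook environment, where it is predefined.

    # Hypothetical silver-layer transform in Databricks (PySpark).
    # Table names, columns, and rules below are illustrative only;
    # "spark" is the session Databricks provides in notebooks.
    from pyspark.sql import functions as F

    bronze = spark.read.table("bronze.service_events")             # raw data landed by ADF
    silver = (
        bronze
        .dropDuplicates(["event_id"])                              # deduplication
        .withColumn("event_ts", F.to_timestamp("event_ts"))        # type casting
        .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
        .filter(F.col("event_id").isNotNull())                     # basic validation
    )

    # Simple automated quality check: fail the run if required fields are missing
    bad_rows = silver.filter(F.col("event_ts").isNull()).count()
    if bad_rows:
        raise ValueError(f"{bad_rows} rows failed the event_ts null check")

    silver.write.mode("overwrite").saveAsTable("silver.service_events")

A production pipeline would more likely be parameterized per source and merged incrementally rather than overwritten; this sketch only shows the shape of the cleaning step.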

Required Qualifications

  • Experience: 5+ years of experience in Data Engineering, specifically focused on building and maintaining ETL/ELT pipelines for large‑scale operational and financial data in a cloud environment.
  • Pipeline Development: Proficiency in building and optimizing data pipelines using Azure Data Factory and Databricks.
  • Technical Proficiency (SQL & Python): Strong proficiency in SQL for data analysis and Python for scripting and transformation.
  • Data Quality Assurance: Experience implementing automated data quality checks (e.g., schema validation, null checks; illustrated in the sketch after this list). A proactive approach to identifying pipeline failures and implementing fixes to prevent recurrence.
  • Platform & Data Familiarity: Experience working with data schemas and APIs from common enterprise platforms such as Microsoft Dynamics 365 F&O, Salesforce, and ServiceNow.
  • LLM Application: Demonstrated experience using LLMs to streamline data engineering workflows and improve development efficiency.
  • Education: Bachelor’s degree in Computer Science, Software Engineering, Data Engineering, or a closely related technical field.
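
As a loose illustration of the automated checks mentioned under Data Quality Assurance (again with placeholder table name, columns, and types, not a prescribed implementation), a schema validation plus null check in PySpark might look like this:

    # Hypothetical schema validation and null checks in PySpark.
    # Expected schema, table name, and required columns are placeholders.
    from pyspark.sql.types import StructType, StructField, StringType, TimestampType

    expected = StructType([
        StructField("case_id", StringType(), False),        # required (non-nullable)
        StructField("created_at", TimestampType(), True),
    ])

    df = spark.read.table("silver.cases")
    actual = {f.name: f.dataType for f in df.schema.fields}

    # Schema validation: each expected column must exist with the expected type
    for field in expected.fields:
        if actual.get(field.name) != field.dataType:
            raise ValueError(f"Schema drift on '{field.name}': found {actual.get(field.name)}")

    # Null checks on required columns
    for field in expected.fields:
        if not field.nullable:
            nulls = df.filter(df[field.name].isNull()).count()
            if nulls:
                raise ValueError(f"{nulls} nulls in required column '{field.name}'")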

Preferred Qualifications

  • Prior experience working in a technology company or SaaS environment

Total Rewards

Want to love where you work? At Intrado, we offer a comprehensive benefits package that includes what you’d expect (medical, dental, vision, life and disability coverage, paid time off, a Registered Retirement Savings Plan (RRSP)) and several that go above and beyond: tuition reimbursement, paid parental leave, access to a comprehensive library of personal and professional training resources, employee discounts, insurance coverage, and more! Apply today to join us in work worth doing!

The starting salary is anticipated between $185,000 and $200,000 CAD and will be commensurate with experience. This position is bonus eligible.
