
Software Engineer, Data Operations

Flosonics

Toronto

On-site

CAD 80,000 - 100,000

Full time



Job summary

A medical technology company located in Toronto is seeking a Software Engineer, Data Operations to strengthen its data infrastructure. The successful candidate will develop and manage systems for real-time data monitoring and algorithm support. Ideal applicants have a Bachelor's degree in a relevant discipline, strong Python and SQL skills, and at least 5 years of related experience. The position offers an opportunity to work on cutting-edge medical devices that improve patient treatment and outcomes.

Qualifications

  • 5+ years of industry experience in data engineering, ML Ops, or similar roles.
  • Demonstrated experience in developing and monitoring algorithms for data labeling and quality control.
  • Strong skills in Python, especially with pandas and NumPy.

Responsibilities

  • Develop, maintain, and monitor infrastructure for generating ground truth data.
  • Create and maintain datasets to support model training and evaluation.
  • Work with engineering teams on automated reporting systems.

Skills

Python
SQL
Machine Learning
Data Engineering
Data Visualization

Education

Bachelor's degree in a relevant field (required)
Master's degree in Machine Learning or Data Engineering (preferred)

Tools

PostgreSQL
DBT
Airflow
Streamlit
Tableau

Job description

We are a team of passionate medical and technological innovators on a mission to improve patient treatment and outcomes with cutting‑edge medical devices like the FloPatch. FloPatch is the world’s first wireless Doppler ultrasound system designed to support the clinical management of critically ill patients. The wearable sensor enables real‑time functional hemodynamic monitoring for patients requiring cardiopulmonary and fluid resuscitation. The successful candidate will assist Flosonics Medical in introducing FloPatch to the world.

This role will play a critical part in strengthening the company’s data infrastructure and monitoring capabilities as we continue to scale. The individual will be responsible for supporting data management and performance monitoring across multiple teams, with a particular focus on maintaining high‑quality ground truth assets for the algorithm team and improving dataset organization and oversight. In addition, this role will provide essential support to the quality and production teams by helping automate log generation and enhance data input mechanisms, addressing current gaps and enabling more efficient, reliable workflows across the organization.

Responsibilities
  • Develop, maintain, and monitor infrastructure that generates ground truth data used for algorithm development.
  • Maintain and create algorithm datasets to support model training and evaluation, aligning with machine learning best practices.
  • Work alongside engineering teams to develop automated reporting systems for stakeholders.
  • Maintain dashboards or reports that summarize KPIs.
  • Develop and maintain systems that monitor and flag trends in algorithm underperformance to the algorithm development team.
  • Maintain data warehouse architecture & raw data structure.
  • Maintain and document data definitions, schemas, and contracts.

Qualifications and Education Requirements
  • Bachelor’s degree in Computer Science, Data Science, Software Engineering, Mathematics, Information Systems, Biomedical Engineering or a related field required.
  • Master’s degree in Machine Learning, Data Engineering, or a closely related field preferred.
  • 5+ years of industry experience in data engineering, machine learning operations (ML Ops), algorithm development, or similar technical roles.
  • Demonstrated experience developing and monitoring algorithms used in data labeling, quality control, or automated evaluation systems.
  • Strong Python skills (pandas, NumPy, plotly, data processing, scripting).
  • Strong SQL skills and experience with relational databases (e.g., PostgreSQL) and data orchestration tools (e.g., DBT, Airflow).
  • Experience with dashboard and visualization tools (e.g., Streamlit, Tableau, Dash).

We wish to thank all applicants; however, only those selected for an interview will be contacted directly. If you are selected to participate in the recruitment process, accommodations are available upon request to meet your accessibility needs.
