
A leading organization in the healthcare sector is looking for a Senior Data Engineer to develop data pipelines and handle ETL management. This role demands expertise in Informatica and SQL, along with a background in data architecture and analytics. The selected candidate will contribute to critical analytics projects, ensuring efficient data delivery and compliance with data governance standards. The role offers potential travel within Canada and a contract term from December 1, 2025, to March 31, 2026.
A leading organization in the healthcare sector is seeking up to two (2) highly skilled and experienced Senior Data Engineers to join its team. This is a highly visible, busy, and challenging role focused on supporting major analytics projects across the healthcare domain.
The successful candidate(s) will play a key role in refining data requirements with customers, developing robust data models, and designing and building optimal data extraction, transformation, and loading (ETL) infrastructure. You will work with diverse healthcare data, ensuring deliverables are met on time and on budget.
This role reports to the Manager of Data Management and will work closely with a project team led by a Project Manager.
Contract Term: Initial contract runs from December 1, 2025 to March 31, 2026, with expected renewals based on performance and project needs beyond the fiscal year end.
Remote Resource(s) may be proposed and may be located anywhere in Canada.
Remote resources selected will be expected to occasionally travel to the main work location, with travel costs covered by the Successful Supplier.
The successful candidate(s) will be responsible for, but not limited to, the following tasks:
Data Pipeline Development: Design and build the required infrastructure for optimal extraction, transformation, and loading of data from a wide variety of data sources using Informatica (Power Centre, Integration Services, IDMC, TDM), SQL, SQL Server Integration Services (SSIS), Application Programming Interfaces (APIs), and other technologies (a minimal illustrative sketch follows this list).
Data Architecture: Architect relational and multi-dimensional databases from structured, semi-structured, and unstructured data, utilizing development techniques including star and snowflake schemas, ETL, Slowly Changing Dimensions (SCD), and Fact and Cube development within an established data platform infrastructure.
Process Improvement: Identify, design, and implement internal process improvements, focusing on automating manual processes, optimizing data delivery, and re-designing data pipelines for enhanced scalability.
Analytics Enablement: Build analytics tools that leverage the data pipeline to provide actionable insights into key business performance metrics, such as customer acquisition and operational efficiencies.
ETL Management: Develop, maintain, optimize, troubleshoot, debug, and monitor the ETL environment, and manage its backup and recovery operations.
Data Governance: Analyze datasets to ensure compliance with data sharing agreements, legislative restrictions, and alignment with data architecture guidelines.
Mentorship: Mentor, support, and train Information Analyst and Junior Data Management resources, as required.
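For illustration only, the sketch below shows the kind of extract-transform-load step referenced under Data Pipeline Development. It is a minimal, assumption-laden example rather than part of the mandate: the endpoint URL, table name, and connection string are hypothetical placeholders, and the actual environment relies on Informatica and SSIS rather than hand-written Python.

```python
# Minimal ETL sketch (illustrative only): pull records from a REST API,
# apply a light transformation, and append them to a SQL Server staging table.
# The endpoint, table name, and connection string below are hypothetical.
import requests
import pandas as pd
from sqlalchemy import create_engine

API_URL = "https://example.org/api/v1/encounters"   # hypothetical source endpoint
TARGET_TABLE = "stg_encounters"                      # hypothetical staging table
ENGINE = create_engine(
    "mssql+pyodbc://user:password@server/warehouse"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

def extract() -> pd.DataFrame:
    """Extract: fetch raw records from the source API."""
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    return pd.DataFrame(response.json())

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Transform: standardize column names, drop duplicates, add an audit column."""
    df = df.rename(columns=str.lower).drop_duplicates()
    df["load_ts"] = pd.Timestamp.utcnow()
    return df

def load(df: pd.DataFrame) -> None:
    """Load: append the transformed rows to the staging table."""
    df.to_sql(TARGET_TABLE, ENGINE, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract()))
```

The sketch is only meant to convey the shape of the pipeline work (source extraction, transformation, governed loading); the tooling named in the responsibilities above would carry out equivalent flows in production.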
The proposed Resource(s) must meet the following criteria:
Experience: A minimum of five (5) years of proven experience working as a Data Engineer or similar role.
Education: Possess a Bachelor’s degree in Information Technology, Engineering, or Computer Science, OR a diploma in a related data management/information technology stream with an additional three (3) years of experience (total of eight years) working as a Data Engineer or similar role.
Informatica Expertise: A minimum of two (2) years of proven experience using Informatica software (i.e., Power Centre, Integration Services, Workflow Manager, IDMC, and TDM) in an integrated support environment.
Location: Must be located and permitted to work in Canada.
The proposed resource(s) should possess the following qualifications and have experience fulfilling the corresponding job requirements:
The successful candidate must possess expert-level knowledge in the following technical areas:
Database Management Systems (DBMS): Expert knowledge of Oracle and SQL Server Database Management Systems and associated tools.
ETL and Data Pipelines: Expert-level ETL and data pipeline development experience, including providing technical consulting and guidance to development teams for the design and development of highly complex or critical ETL architecture.
Computer Programming Languages: Expert knowledge of programming languages such as PL/SQL, R, and Python.
Operating Systems: Expert knowledge of operating systems such as Unix, Linux, and Windows.
Shell Scripting: Proficiency in Shell Scripting language.
Data APIs: Expert knowledge of Data Application Programming Interface (API) development and utilization.
Theoretical Foundations: Expert knowledge of algorithms and data structures.
The successful candidate must have extensive practical experience in the following areas:
Information Management & Design: Extensive experience with information management, logic modeling, conceptual design, business process design, and workflow design.
System Lifecycle: Extensive experience in requirements gathering, analysis, planning, design, development, implementation, and maintenance of Data Management systems.
Problem-Solving: Extensive experience applying critical, constructive, and creative problem-solving skills, including identifying issues, defining objectives, developing an action plan for what needs to be done, and identifying the resources required to deliver quality products.
Cloud Platforms: Extensive experience working with Cloud platforms for data management (e.g., Azure, AWS, GCP).
The following qualifications and experience would be considered a significant asset:
Certification: Possessing a Microsoft Certified: Azure Data Engineer Associate certification.
Industry Experience: Experience working specifically with healthcare data.