Data Engineer

Envision Financial

Toronto

Hybrid

CAD 80,000 - 100,000

Full time

Yesterday

Job summary

A financial services company in Toronto is seeking a Data Engineer responsible for designing data pipelines and managing data assets within Microsoft and Oracle environments. The ideal candidate will have at least 5 years of experience in data engineering, strong problem-solving skills, and proficiency in SQL and Python. This role requires excellent communication skills and a detail-oriented approach to data quality. You will be working on data integration, performance monitoring, and creating actionable insights using Power BI.

Qualifications

  • 5 years of experience in data engineering or related roles.
  • Exceptional problem-solving and analytical skills.
  • Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders.
  • Ability to manage multiple projects under pressure.
  • Detail-oriented with a commitment to data quality.

Responsibilities

  • Design and develop scalable data pipelines and ETL processes.
  • Create and manage databases and data warehouses.
  • Implement data quality controls and governance.
  • Design efficient data models for analytics.
  • Monitor data pipeline performance and troubleshoot issues.
  • Develop reports and dashboards in Power BI.

Skills

Data engineering
Data integration
SQL
Python
Problem-solving
Communication
Attention to detail
Data quality

Education

Bachelor’s degree in Computer Science or a related field

Tools

Microsoft Azure
SQL Server
Oracle Database
Power BI
Oracle Data Integrator

Job description

We are currently seeking a Data Engineer to join our team.

The Data Engineer is responsible for designing, building, and managing assets and pipelines for data collection, storage, and analysis within Microsoft and Oracle environments. This role involves integrating large datasets from multiple sources to ensure data is accessible, consistent, and of high quality for reporting, dashboards, insights, and analytics. Experience is required with Microsoft Azure cloud services, SQL Server, Oracle Database, and Oracle Cloud Infrastructure to develop scalable data solutions. Additionally, responsibilities include ensuring our data pipelines are robust and secure, supporting data-driven decision-making across the organization.

We are open to filling this role across Canada; however, please note that preference will be given to candidates located in BC due to the proximity of our office locations. To be considered for this role, you must be eligible to work in Canada at the time of applying.

Here’s what would be included as a part of your typical day

  1. Design and Develop Data Pipelines: Design scalable data pipelines and ETL processes to extract, transform, and load data from Microsoft and Oracle systems into centralized repositories. Use Microsoft Fabric, Azure Data Factory, and Oracle Data Integrator for efficient data extraction and transformation.
  2. Data Storage & Management: Create and manage databases, data warehouses, and data lakes on Microsoft SQL Server/Azure and Oracle. Ensure high performance and reliability by designing, indexing, and partitioning relational databases. Monitor and tune database health to meet business needs.
  3. Ensure Data Quality, Security, and Governance: Implement data quality controls, validation rules, and cleansing processes to maintain the integrity and accuracy of data within pipelines. Adhere to data governance standards and security protocols across both platforms, incorporating data masking or encryption where necessary, to ensure compliance with industry regulations and corporate policies.
  4. Develop and Optimize Data Models: Design efficient data models and schemas for analytics and reporting on Azure and Oracle. Structure databases and data warehouses to support querying and BI reporting, and update architecture as needed.
  5. Data Insights Support: Work with analytics teams to ensure that data is available and appropriately structured for BI tools. Provide curated datasets or views for reporting and assist in the creation of dashboards and reports, particularly with Microsoft Power BI, to convert raw data into actionable insights.
  6. Performance Monitoring & Troubleshooting: Monitor data pipeline and database performance and quickly resolve any issues. Use monitoring tools and logs to identify anomalies or bottlenecks. Optimize SQL queries, ETL jobs, and storage designs to improve throughput and reduce latency or cost.
  7. Reporting and Dashboarding: Design and develop efficient reports and dashboards in Power BI, Microsoft Fabric, and Oracle Analytics Cloud.

Required Skills, Experience & Qualifications

  • Bachelor’s degree in Computer Science, Information Systems, or a related field is required.
  • 5 years of professional experience in data engineering, data platform management, or related roles, or an equivalent combination of education and experience.
  • Exceptional problem-solving and analytical skills to troubleshoot complex data issues and devise efficient solutions.
  • Excellent communication skills, with the ability to translate complex technical information into business terms for non-technical stakeholders.
  • Ability to work under pressure and manage multiple projects and deadlines efficiently.
  • Meticulous attention to detail and a commitment to data accuracy and quality.
  • A growth mindset and eagerness to stay updated with new technologies and industry best practices.
  • Expertise in data engineering, data management, and data integration tools and technology stack.
  • Expertise in data modeling, database design, and data integration.
  • Proficiency in SQL, Python, and other relevant programming languages.
  • Demonstrates an understanding of risk and risk ownership through adherence to policies and procedures.