Intermediate Data Engineer (SQL, Databricks/Snowflake, DBT, Airflow)

TEEMA Solutions Group

Toronto

Hybrid

CAD 80,000 - 100,000

Full time

Today

Job summary

A prominent staffing agency in Toronto seeks an Intermediate Data Engineer to join a high-impact Data & Analytics team. The candidate will design ETL/ELT data pipelines and collaborate with business teams to provide technical solutions. Applicants should have 4-6 years of data engineering experience and strong skills in SQL and Snowflake or Databricks. The position offers a hybrid working model.

Qualifications

  • 4-6 years of data engineering experience.
  • Solid SQL and experience with Snowflake or Databricks.
  • Experience with CI/CD, data lineage, and monitoring tools.

Responsibilities

  • Design, build, and maintain robust ETL/ELT data pipelines.
  • Collaborate with analysts and business teams for technical solutions.
  • Ensure data quality, observability, and governance across systems.

Skills

SQL
Snowflake
Databricks
DBT
Airflow
Cloud data migration
Data engineering

Tools

AWS
CI/CD tools

Job description

Overview

DATA ENGINEER - Toronto (3 days downtown / Hybrid)

Our client is hiring a Data Engineer to join their Data & Analytics team. We are looking for an Intermediate Data Engineer with a proven track record in the financial sector. You'll help build and optimize scalable data pipelines, support data governance, and enable advanced analytics across the organization. Be part of a high-impact team driving data innovation in the wealth tech space.

Responsibilities
  • Design, build, and maintain robust ETL/ELT data pipelines.
  • Collaborate with analysts and business teams to translate needs into technical solutions.
  • Recommend scalable data architectures using AWS, Snowflake, and modern data stack tools.
  • Ensure data quality, observability, and governance across systems.
  • Support migration projects to cloud-based platforms.
Qualifications
  • 4-6 years of data engineering experience.
  • Solid SQL and either Snowflake or Databricks, plus DBT and/or Airflow.
  • Experience with CI/CD, data lineage, and monitoring tools.
  • Financial services experience (nice to have).
Screening Questions
  1. How proficient are you with programming languages and tools for data engineering tasks (e.g., SQL, Python, Airflow, Terraform), and how many years of experience do you have with them?
  2. Can you describe your experience designing, building, and maintaining ETL/ELT pipelines? Which tools and technologies did you use?
  3. Have you worked directly with analysts or business stakeholders to translate business needs into technical data solutions? Can you share an example?
  4. What is your experience designing data architectures on AWS using services like Glue, S3, Lambda, and Redshift or Snowflake?
  5. How do you ensure data quality, observability, and lineage in your pipelines? What tools or practices have you implemented?
  6. Have you participated in or led a cloud data migration project? What was your role, and what challenges did you face?
  7. Have you worked with CI/CD pipelines for data deployments or infrastructure? What tools did you use, and how did you manage rollouts?
  8. Do you have experience working in financial services or other regulated industries? If so, what types of data and systems were involved?

Send your resume and your replies to the above screening questions to: sasha@talenttohire.com

For more positions, visit: https://www.linkedin.com/company/talenttohire/jobs
