Senior Data Engineer: Scalable Cloud Data Pipelines

Company: fox com
Location: Toronto (Hybrid)
Salary: CAD 90,000 - 120,000
Employment type: Full time
Posted: 2 days ago

Job summary

A leading media company based in Toronto is seeking a Data Engineer to build scalable data systems. The role requires 5+ years of experience in data engineering, specifically in ETL, data modeling, and AWS services. This position focuses on designing data pipelines and optimizing data storage using tools such as Databricks and Snowflake. Candidates should have strong coding skills in Python or Scala and experience with data monitoring services. Join a collaborative team that tackles high-impact data challenges in a dynamic environment.

Benefits

  • Work on high-impact data challenges
  • Collaborative team environment
  • Access to modern data platforms and tools

Qualifications

  • 5+ years of hands-on experience in data engineering with ETL development.
  • Proficiency in Databricks and Snowflake.
  • Strong coding skills in Python or Scala.
  • Experience with AWS data services like Glue and Lambda.
  • Familiarity with monitoring tools like Datadog.

Responsibilities

  • Design ETL pipelines to efficiently manage large-scale data.
  • Develop data models and pipelines in Databricks and Snowflake.
  • Implement workflows using AWS services.
  • Write scalable code for data transformation in Python or Scala (see the illustrative sketch after this list).
  • Manage data quality and monitoring frameworks.
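
For illustration only: a minimal sketch of the kind of Python transformation work described above, assuming a PySpark environment such as Databricks. The table names, columns, and aggregation are hypothetical and are not taken from this posting.

    # Minimal illustrative ETL sketch (hypothetical tables and columns).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    # Extract: read a hypothetical raw events table.
    raw = spark.read.table("raw.events")

    # Transform: drop rows without an event type and build a daily count per type.
    daily = (
        raw.filter(F.col("event_type").isNotNull())
           .withColumn("event_date", F.to_date("event_ts"))
           .groupBy("event_date", "event_type")
           .agg(F.count("*").alias("event_count"))
    )

    # Load: write the result to a hypothetical curated table.
    daily.write.mode("overwrite").saveAsTable("curated.daily_event_counts")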

Skills

Data engineering
Python
AWS
ETL development
Data modeling
Databricks
Snowflake

Education

Bachelor's degree in Computer Science or equivalent experience

Tools

Datadog
Airflow
Terraform