
AWS Data Engineer

Epsilon Solutions Ltd

Toronto

On-site

CAD 80,000 - 100,000

Part time

Today

Job summary

A data engineering firm is seeking an AWS Data Engineer in Toronto, ON. You will design scalable data pipelines, develop ETL processes, and implement AWS services. Candidates should have strong AWS, SQL, and communication skills. The role emphasizes automation and project leadership and is offered on a contract basis.

Qualifications

  • Expert knowledge of SQL is mandatory.
  • Strong experience with AWS services such as Redshift, Glue, and Lambda.
  • Intermediate knowledge of Python and PySpark is required.

Responsibilities

  • Design and build scalable data pipelines using AWS services.
  • Develop efficient ETL processes for data transformation and loading.
  • Automate tasks and build reusable frameworks for efficiency.

Job description

Overview

Title: AWS Data Engineer

Location: Toronto, ON

Terms: Contract

Responsibilities
  • Design and build scalable data pipelines using AWS services such as AWS Glue, Amazon Redshift, SQS, SNS, CloudWatch, and Step Functions, with infrastructure managed through CDK or Terraform.
  • Develop efficient ETL processes for data extraction, transformation, and loading into data warehouses and lakes.
  • Create and manage applications using Python, PySpark, SQL, Databricks, and various AWS technologies.
  • Automate repetitive tasks and build reusable frameworks to improve efficiency.
  • Contribute to architecture design and review, team leadership, project management, and code review and sign-off.
  • Code & Architecture Review - Ensuring quality and scalability of code and data pipelines.
  • Team Leadership Skills - Leading sprint planning and retrospectives, and assigning and reviewing tasks.
  • Project Management Skills - Agile / Scrum methodology, risk & deadline management.
  • Mentoring & Coaching - Upskilling team on AWS best practices and data architecture.
  • Stakeholder Communication - Translating tech into business value and reporting to non-technical managers.
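As a rough illustration of the extract-transform-load work described in the responsibilities above, here is a minimal sketch in plain Python. It uses an in-memory sqlite3 database as a stand-in for Amazon Redshift, and all table and column names (`raw_events`, `user_totals`, etc.) are invented for the example:

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> None:
    """Toy ETL: extract raw events, transform, and load a reporting table."""
    cur = conn.cursor()
    # Extract: a raw landing table (in practice this would be S3 data
    # registered in the Glue Data Catalog).
    cur.execute("CREATE TABLE raw_events (user_id INTEGER, amount_cents INTEGER)")
    cur.executemany(
        "INSERT INTO raw_events VALUES (?, ?)",
        [(1, 250), (1, 750), (2, 500)],
    )
    # Transform + load: aggregate per user into the warehouse table.
    cur.execute(
        "CREATE TABLE user_totals (user_id INTEGER PRIMARY KEY, total_cents INTEGER)"
    )
    cur.execute(
        """
        INSERT INTO user_totals
        SELECT user_id, SUM(amount_cents)
        FROM raw_events
        GROUP BY user_id
        """
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
run_etl(conn)
print(conn.execute("SELECT * FROM user_totals ORDER BY user_id").fetchall())
# [(1, 1000), (2, 500)]
```

In a real pipeline the same extract/transform/load shape would typically be expressed as a PySpark job scheduled by AWS Glue or Step Functions rather than raw SQL against a local database.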
Qualifications & Skills
  • SQL - Expert (Must have)
  • AWS (Redshift / Lambda / Glue / SQS / SNS / CloudWatch / Step Functions / CDK or Terraform) - Expert (Must have)
  • PySpark - Intermediate / Expert
  • Python - Intermediate (Must have, or equivalent PySpark knowledge)
  • Strong communication skills
Experience & Focus Areas
  • Exposure to cost optimization strategies in AWS.
  • Data lake vs. data warehouse architecture.
  • Designing scalable ETL pipelines.
  • Real-time data ingestion.
  • Data partitioning and performance optimization.
  • Security and compliance.
  • Data quality.