Senior Data Engineer (TSD)

City of Toronto

Toronto

On-site

CAD 90,000 - 120,000

Full time


Job summary

The City of Toronto is seeking a skilled Senior Data Engineer to join its Enterprise Data Platform team. The position involves designing and implementing AWS-based data solutions that enhance city services. The ideal candidate will have strong AWS and Python skills, a background in data engineering, and a commitment to building efficient data infrastructure. Join a dynamic team dedicated to making a meaningful impact in the community.

Benefits

Opportunity for professional growth
Collaborative work environment
Impactful projects that serve the community

Qualifications

  • 5+ years of experience in data engineering or related fields.
  • Deep expertise in AWS technologies, particularly in data-related services.
  • Strong proficiency in Python for data processing.

Responsibilities

  • Utilize AWS services to build and maintain data infrastructure.
  • Design and implement ETL/ELT processes using AWS Glue and Apache Spark.
  • Contribute to the implementation of a data mesh architecture.

Skills

AWS Technologies
Python Programming
Data Mesh Architecture
Apache Spark
ETL/ELT Processes
Infrastructure as Code
Data Governance
Problem-Solving
Communication Skills

Education

Bachelor's degree in Computer Science or related field
Master's degree (preferred)

Tools

AWS Glue
Terraform
DBT
Databricks
Snowflake

Job description

The City of Toronto is seeking a skilled and experienced Senior Data Engineer to join our Enterprise Data Platform team. This role is vital in supporting the design, development, and implementation of our Enterprise Data Platform. The ideal candidate will have a strong background in AWS technologies, data engineering, and modern data architectures.

As a Senior Data Engineer at the City of Toronto, you will have the opportunity to work on cutting-edge data solutions that directly impact the lives of Toronto's residents. You'll be part of a team driving the city's digital transformation, working on projects that enhance city services and operations through innovative data utilization.

You'll work in a collaborative environment that values your expertise and provides opportunities for professional growth. If you're passionate about leveraging data and AWS technologies to create meaningful change, we encourage you to apply and be part of our mission to build a smarter, more connected Toronto.

Key Responsibilities:
  • Utilize a wide range of AWS services to build and maintain scalable, secure, and efficient data infrastructure. Key services include S3, Redshift, Kinesis, EMR, Glue, DataZone, Lake Formation, and CloudFormation.
  • Design, implement, and maintain robust ETL/ELT processes using tools such as AWS Glue, DBT (Data Build Tool), and Apache Spark.
  • Contribute to the implementation of a data mesh architecture, enabling decentralized, domain-oriented data ownership and management.
  • Develop and maintain infrastructure as code using Terraform or AWS CloudFormation to automate and streamline the deployment of cloud resources.
  • Utilize Python and Apache Spark for large-scale data processing, transformation, and analysis.
  • Design and implement efficient data models to support analytics, machine learning, and reporting needs.
  • Develop and maintain both batch and real-time data streaming solutions using technologies such as AWS Kinesis.
  • Implement and adhere to data governance policies to ensure data quality, privacy, and compliance with regulations.
  • Work with technologies such as Databricks and Snowflake to enhance the capabilities of the data platform.
  • Work closely with data scientists, analysts, and other stakeholders to understand data requirements and provide tailored solutions.
  • Create and maintain comprehensive documentation for data processes, pipelines, and models. Share knowledge with team members and contribute to the team's overall growth.
Required Qualifications:
  • Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field.
  • 5+ years of experience in data engineering or related fields.
  • Deep expertise in AWS technologies, particularly in data-related services (S3, Redshift, Kinesis, EMR, Glue, etc.).
  • Strong proficiency in Python programming, especially for data processing tasks.
  • Experience with big data processing frameworks, particularly Apache Spark.
  • Hands-on experience with ETL/ELT processes and tools like AWS Glue and DBT.
  • Solid understanding of data modeling concepts and techniques.
  • Experience with infrastructure as code, preferably using Terraform or AWS CloudFormation.
  • Familiarity with data governance principles and privacy regulations (e.g., GDPR, CCPA).
Preferred Qualifications:
  • Master's degree in a relevant field.
  • AWS certifications (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Big Data - Specialty).
  • Experience with data mesh architecture concepts and implementation.
  • Knowledge of other cloud platforms (e.g., Azure, GCP) for multi-cloud strategies.
  • Familiarity with containerization technologies (e.g., Docker, Kubernetes).
  • Experience with CI/CD practices and tools.
  • Understanding of machine learning workflows and MLOps practices.
Key Skills:
  • Strong problem-solving and analytical skills
  • Excellent communication skills, with the ability to explain complex technical concepts to non-technical stakeholders
  • Self-motivated with the ability to work independently and as part of a team
  • Attention to detail and commitment to delivering high-quality work
  • Adaptability and willingness to learn new technologies and methodologies
  • Time management skills and ability to handle multiple projects simultaneously
Equity, Diversity and Inclusion

The City is an equal opportunity employer, dedicated to creating a workplace culture of inclusiveness that reflects the diverse residents that we serve.
