Data Engineering Manager

TEEMA Solutions Group

Toronto

Hybrid

CAD 120,000 - 150,000

Full time

Today

Job summary

A technology solutions firm in Toronto is seeking a Data Engineering Manager to architect, implement, and optimize data solutions using Databricks. The role requires leading a technical team, managing cloud infrastructure, and ensuring data processing excellence. Ideal candidates will have extensive experience in Databricks and strong programming skills in Python or Scala. This position offers hybrid work with potential for some travel.

Qualifications

  • 5+ years of hands-on experience with Databricks and Apache Spark.
  • Proven experience leading and mentoring data engineering teams.
  • Strong programming proficiency in Python (PySpark) or Scala.

Responsibilities

  • Architect and optimize end-to-end data solutions on Databricks.
  • Lead and mentor a team of data engineers.
  • Manage Databricks infrastructure and cost optimization.

Skills

Databricks
Apache Spark
Python (PySpark)
AWS services (S3, EC2, Lambda)
Agile methodologies
Terraform
SQL
CI/CD

Education

Bachelor’s Degree in Engineering or Computer Science

Tools

Git
AWS CloudFormation

Job description

Hybrid: onsite 2 days/week, or 8 days/month

Two-interview process; one interview will be onsite

Databricks experience is a must

AWS or Azure experience required

Must be able to review Python code

Create designs and validate code

Own the architecture

As the Data Engineering Manager, you will be responsible for architecting, implementing, and optimizing end-to-end data solutions on Databricks while integrating with core AWS services. You will lead a technical team of data engineers, ensuring best practices in performance, security, and scalability. This role requires a deep, hands‑on understanding of Databricks internals and a track record of delivering large‑scale data platforms in a cloud environment.

Lead a team of data engineers in the architecture and maintenance of the Databricks Lakehouse platform, ensuring optimal platform performance and efficient data versioning using Delta Lake.

Manage and optimize Databricks infrastructure including cluster lifecycle, cost optimization, and integration with AWS services (S3, Glue, Lambda).

Design and implement scalable ETL/ELT frameworks and data pipelines using Spark (Python/Scala), incorporating streaming capabilities where needed.

Drive technical excellence through advanced performance tuning of Spark jobs, cluster configurations, and I/O optimization for large‑scale data processing.

Implement robust security and governance frameworks using Unity Catalog, ensuring compliance with industry standards and internal policies.

Lead and mentor data engineering teams, conduct code reviews, and champion Agile development practices while serving as technical liaison across departments.

Establish and maintain comprehensive monitoring solutions for data pipeline reliability, including SLAs, KPIs, and alerting mechanisms.

Configure and manage end-to-end CI/CD workflows, including source control, automated testing, and deployment automation.

Qualifications


You have a Bachelor’s Degree in Engineering, Computer Science or equivalent.

5+ years of hands‑on experience with Databricks and Apache Spark, demonstrating expertise in building and maintaining production‑grade data pipelines.

Proven experience leading and mentoring data engineering teams in complex, fast‑paced environments.

Extensive experience with AWS cloud services (S3, EC2, Glue, EMR, Lambda, Step Functions).

Strong programming proficiency in Python (PySpark) or Scala, and advanced SQL skills for analytics and data modeling.

Demonstrated expertise in infrastructure as code using Terraform or AWS CloudFormation for cloud resource management.

Strong background in data warehousing concepts, dimensional modeling, and experience with RDBMS systems (e.g., Postgres, Redshift).

Proficiency with version control systems (Git) and CI/CD pipelines, including automated testing and deployment workflows.

Excellent communication and stakeholder management skills, with demonstrated ability to translate complex technical concepts into business terms.

Demonstrated use of AI in the development lifecycle.

Some travel to the US may be required.

Knowledge of the financial industry is preferred.
