
Data Engineering Lead

Cognizant

Remote

CAD 120,000 - 150,000

Full time

Today

Job summary

A leading tech company is seeking a Data Engineering Lead in Mississauga to drive modernization and reliability of enterprise data platforms. Responsibilities include leading a large engineering team, optimizing DBT-Snowflake workflows, and establishing DevOps pipelines. The ideal candidate will have strong experience with ETL tools, cloud transformation, and a background in data governance. Join a supportive workplace that values work-life balance and innovative solutions.

Qualifications

  • Strong hands-on experience with ETL and data engineering tools.
  • Demonstrated success leading large-scale cloud transformation initiatives.
  • Experience designing and delivering high-volume ETL workflows.
  • Proficiency with DevOps tools and CI/CD pipeline setup.
  • Familiarity with incident management tools.

Responsibilities

  • Lead the data engineering strategy and delivery for enterprise Member Data Products.
  • Oversee and mentor a team of 30 engineering professionals.
  • Re-architect and optimize DBT–Snowflake pipelines.
  • Establish and maintain DevOps pipelines integrating automated validation.
  • Partner with leaders to define data quality and monitoring standards.

Skills

DBT
Informatica PowerCenter
Snowflake
SQL
PL/SQL
Shell Scripting
Python
DevOps

Tools

Jira
Tidal
Autosys
Jenkins
Azure DevOps

Job description

About the role

As a Data Engineering Lead, you will make an impact by driving the modernization, optimization, and reliability of enterprise data platforms. You will be a valued member of the Data Engineering & Analytics team and work collaboratively with engineering leads, business stakeholders, cloud architects, and cross‑functional technology teams to deliver high‑quality, scalable data solutions.

In this role, you will:
  • Lead the end‑to‑end data engineering strategy and delivery for enterprise Member Data Products, ensuring alignment with analytics, compliance, and governance needs.
  • Oversee and mentor a team of 30 engineering professionals, guiding architecture modernization from Informatica PowerCenter to DBT and modern CI/CD practices.
  • Re‑architect and optimize DBT–Snowflake pipelines, achieving significant performance improvements across high‑volume data workloads.
  • Establish and maintain DevOps pipelines integrating automated validation and standardized release processes.
  • Partner with business and technology leaders to define data quality, monitoring, and change‑management standards aligned with enterprise governance.
  • Deliver secure, compliant, and reliable ETL solutions supporting regulatory, reporting, and operational systems.

Work model

We strive to provide flexibility wherever possible. Based on this role’s business requirements, this is a remote position open to qualified applicants in Mississauga, ON. Regardless of your working arrangement, we are here to support a healthy work‑life balance through our various wellbeing programs.

The working arrangements for this role are accurate as of the date of posting and may change based on project or client requirements.

What you need to have to be considered
  • Strong hands‑on experience with ETL and data engineering tools: DBT, Informatica PowerCenter, Snowflake, Netezza, Oracle, SQL Server, Neo4j.
  • Demonstrated success leading large‑scale cloud transformation initiatives, including Snowflake migration and ETL modernization (e.g., 75%+ reduction in processing time).
  • Experience designing and delivering high‑volume ETL workflows, including Shell Scripting automation and PL/SQL‑based performance optimization.
  • Proficiency with scheduling and DevOps tools such as Tidal, Autosys, Jenkins, Azure DevOps, and CI/CD pipeline setup.
  • Strong SQL, PL/SQL, and basic Python skills applicable to data engineering workloads.
  • Experience supporting UAT, deployments, and cross‑functional stakeholder engagement in regulatory or compliance‑driven environments.
  • Familiarity with incident management tools like Jira or Service Marketplace.

These will help you stand out
  • Experience leading large engineering teams, including coaching, technical mentorship, and productivity improvement efforts.
  • Proven ability to consolidate disparate data platforms into unified, high‑accuracy reporting environments.
  • Expertise optimizing DBT–Snowflake workflows (e.g., 300× performance improvements).
  • Strong understanding of enterprise data governance, data quality frameworks, and monitoring best practices.
  • Background working with Azure or AWS cloud platforms in enterprise data ecosystems.

We're excited to meet people who share our mission and can make an impact in a variety of ways. Don't hesitate to apply, even if you only meet the minimum requirements listed. Think about your transferable experiences and unique skills that make you stand out as someone who can bring new and exciting things to this role.
