DBT Developer

Cognizant

Remote

CAD 80,000 - 100,000

Full time

Yesterday

Job summary

A global technology company is seeking a DBT Developer to lead data engineering strategies and optimize data platforms. This remote position requires strong hands-on experience with data engineering tools such as DBT, Informatica, and Snowflake. You will oversee a team, mentor engineering professionals, and establish DevOps processes. The ideal candidate will excel in cloud transformation and possess skills in SQL, PL/SQL, and Python. Join us to deliver high-quality, scalable data solutions in a collaborative environment.

Benefits

Flexibility in working arrangements
Well-being programs

Qualifications

  • Strong hands-on experience with ETL and data engineering tools.
  • Proficiency with scheduling and DevOps tools.
  • Experience leading large-scale cloud transformation initiatives.

Responsibilities

  • Lead end-to-end data engineering strategy and delivery for Member Data Products.
  • Oversee and mentor a team of 30 engineering professionals.
  • Establish and maintain DevOps pipelines.

Skills

ETL and data engineering tools
SQL
Leadership
Cloud transformation
PL/SQL
Python

Tools

DBT
Snowflake
Informatica PowerCenter
Jira
Azure DevOps
Autosys
Tidal

Job description
About the role

As a DBT Developer, you will make an impact by driving the modernization, optimization, and reliability of enterprise data platforms. You will be a valued member of the Data Engineering & Analytics team and work collaboratively with engineering leads, business stakeholders, cloud architects, and cross‑functional technology teams to deliver high‑quality, scalable data solutions.

In this role, you will:
  • Lead the end‑to‑end data engineering strategy and delivery for enterprise Member Data Products, ensuring alignment with analytics, compliance, and governance needs.
  • Oversee and mentor a team of 30 engineering professionals, guiding architecture modernization from Informatica PowerCenter to DBT and modern CI/CD practices.
  • Re‑architect and optimize DBT–Snowflake pipelines, achieving significant performance improvements across high‑volume data workloads.
  • Establish and maintain DevOps pipelines integrating automated validation and standardized release processes.
  • Partner with business and technology leaders to define data quality, monitoring, and change‑management standards aligned with enterprise governance.
  • Deliver secure, compliant, and reliable ETL solutions supporting regulatory, reporting, and operational systems.

Work model

We strive to provide flexibility wherever possible. Based on this role's business requirements, this is a remote position open to qualified applicants in Mississauga, ON. Regardless of your working arrangement, we are here to support a healthy work-life balance through our various well-being programs.

The working arrangements for this role are accurate as of the date of posting and may change based on project or client requirements.

What you need to have to be considered
  • Strong hands‑on experience with ETL and data engineering tools: DBT, Informatica PowerCenter, Snowflake, Netezza, Oracle, SQL Server, Neo4j.
  • Demonstrated success leading large‑scale cloud transformation initiatives, including Snowflake migration and ETL modernization (e.g., 75%+ reduction in processing time).
  • Experience designing and delivering high‑volume ETL workflows, including Shell Scripting automation and PL/SQL‑based performance optimization.
  • Proficiency with scheduling and DevOps tools such as Tidal, Autosys, Jenkins, Azure DevOps, and CI/CD pipeline setup.
  • Strong SQL, PL/SQL, and basic Python skills applicable to data engineering workloads.
  • Experience supporting UAT, deployments, and cross‑functional stakeholder engagement in regulatory or compliance‑driven environments.
  • Familiarity with incident management tools like Jira or Service Marketplace.

These will help you stand out
  • Experience leading large engineering teams, including coaching, technical mentorship, and productivity improvement efforts.
  • Proven ability to consolidate disparate data platforms into unified, high‑accuracy reporting environments.
  • Expertise optimizing DBT–Snowflake workflows (e.g., 300× performance improvements).
  • Strong understanding of enterprise data governance, data quality frameworks, and monitoring best practices.
  • Background working with Azure or AWS cloud platforms in enterprise data ecosystems.

We're excited to meet people who share our mission and can make an impact in a variety of ways. Don't hesitate to apply, even if you only meet the minimum requirements listed. Think about your transferable experiences and unique skills that make you stand out as someone who can bring new and exciting things to this role.
