Data Engineer – Snowflake & DBT (100% remote) (Windsor, ON)

Recrute Action

Windsor

Remote

CAD 82,000–90,000

Full time

Today

Job summary

A leading recruitment firm is seeking a Data Engineer to design and optimize data solutions using Snowflake and DBT. This fully remote role is open across Québec and Ontario and offers a competitive salary of $82–90K, a comprehensive benefits package, and an annual performance bonus. The ideal candidate will have 4–6 years of experience in data engineering, proficiency in SQL, and strong communication skills in English and French.

Benefits

Salary starting at $82,000
Annual bonus based on performance
3 weeks of vacation
Comprehensive benefits package
Retirement savings plan with employer contribution

Qualifications

  • 4–6 years of experience in data engineering or related field.
  • Hands-on experience with Snowflake and DBT for warehousing and data transformation.
  • Familiarity with both relational and NoSQL databases.

Responsibilities

  • Design, build, and maintain data pipelines, warehouses, and data models using Snowflake and DBT.
  • Collaborate with cross-functional teams to gather data requirements and develop efficient data architectures.
  • Implement and manage ETL/ELT processes across structured and unstructured data sources.

Skills

SQL
Snowflake
DBT
Azure Data Factory
Data governance
REST APIs
Collaboration in multidisciplinary teams

Education

Bachelor's or graduate degree in computer engineering, data science, mathematics, or related discipline

Tools

MS SQL Server
PostgreSQL
Cosmos DB
Airflow
Azkaban
Luigi

Job description

Data Engineer – Snowflake & DBT (Remote)

Design and optimize cutting-edge data pipelines and warehouse solutions using Snowflake and DBT in a fully remote role across Québec or Ontario. This permanent opportunity offers a salary of $82–90K (negotiable based on experience) and the chance to work in a dynamic, cloud-based environment with strategic impact.

What is in it for you

  • Salary starting at $82,000 (negotiable based on experience).
  • Annual bonus based on individual performance and company profitability, paid at the end of fall.
  • Permanent full-time position (40 hours/week), Monday to Friday, between 8 am and 5 pm.
  • 3 weeks of vacation per year, depending on seniority.
  • Comprehensive benefits package available after 90 days: dental and medical insurance, massage therapy, chiropractic care, and more.
  • Retirement savings plan: voluntary contribution of up to 3% of salary, with matching employer contribution.

Responsibilities
  • Design, build, and maintain data pipelines, warehouses, and data models using Snowflake and DBT (a brief illustrative sketch follows this list).
  • Collaborate with cross-functional teams to gather data requirements and develop efficient data architectures.
  • Implement and manage ETL/ELT processes across structured and unstructured data sources using tools such as Azure Data Factory and SQL.
  • Enforce data governance protocols including quality, lineage, metadata management, and security compliance.
  • Monitor system performance, conduct tuning, and proactively address bottlenecks.
  • Maintain documentation of data processes, architecture, and technical specifications.
  • Contribute to team knowledge by supporting peers and staying current on data engineering trends.
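
As a concrete illustration of this pipeline work, the sketch below shows a minimal Airflow DAG that stages source data and then runs dbt models against Snowflake. It is a hypothetical example for orientation only, not part of this posting: the DAG id, schedule, and dbt selector are invented placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_orders() -> None:
    # Hypothetical extract step: in practice this might pull from a
    # relational source or a REST API and stage the data for Snowflake.
    print("extracting source data")


with DAG(
    dag_id="orders_elt",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)

    # dbt performs the in-warehouse transformation; the selector is a placeholder.
    dbt_run = BashOperator(task_id="dbt_run", bash_command="dbt run --select orders")

    extract >> dbt_run
```

In a setup like this, dbt owns the SQL transformations inside Snowflake while Airflow owns scheduling and dependencies, which is one common way the responsibilities above fit together.
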
What you will need to succeed
  • Bachelor's or graduate degree in computer engineering, data science, mathematics, or a related discipline.
  • Relevant certifications in Azure Data Services or Snowflake are considered an asset.
  • 4–6 years of experience in data engineering or a related field.
  • Proficient in SQL and familiar with both relational and NoSQL databases (e.g., MS SQL Server, Snowflake, PostgreSQL, Cosmos DB).
  • Hands-on experience with Snowflake and DBT for warehousing and data transformation.
  • Skilled in designing and optimizing data pipelines and ETL/ELT workflows.
  • Experience with cloud platforms, particularly Azure, and cloud-based storage systems.
  • Familiarity with data pipeline and orchestration tools such as Azure Data Factory, Airflow, Azkaban, or Luigi.
  • Experience leveraging REST APIs for data integration (see the sketch after this list).
  • Comfortable working in multidisciplinary teams to address complex data processing challenges.
  • Proficiency in English and French to support data governance, documentation, and collaboration across teams in both languages.
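
As an illustration of the REST-API integration mentioned above, here is a minimal sketch that pulls JSON records from an API and lands them in a Snowflake staging table via the official Python connector. The endpoint, credentials, and table name are placeholders, not details from this posting.

```python
import json

import requests
import snowflake.connector

# Hypothetical endpoint; replace with a real source API.
API_URL = "https://api.example.com/v1/orders"


def load_orders() -> None:
    # Pull one page of records from the source API.
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    records = resp.json()

    # Connection parameters are placeholders, not real credentials.
    conn = snowflake.connector.connect(
        account="my_account",
        user="my_user",
        password="my_password",
        warehouse="LOAD_WH",
        database="RAW",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        # Land raw JSON in a VARIANT column so downstream dbt models can parse it.
        cur.execute("CREATE TABLE IF NOT EXISTS ORDERS_RAW (PAYLOAD VARIANT)")
        cur.executemany(
            "INSERT INTO ORDERS_RAW (PAYLOAD) SELECT PARSE_JSON(%s)",
            [(json.dumps(r),) for r in records],
        )
    finally:
        conn.close()
```

In practice a load like this would run as one task in an orchestrated pipeline (as in the Airflow sketch above), with dbt models reading from the raw staging table.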