Data Engineer – Snowflake & DBT (Remote)
Design and optimize cutting-edge data pipelines and warehouse solutions using Snowflake and DBT in a fully remote role open to candidates across Québec and Ontario. This permanent opportunity offers a salary of $82,000–$90,000 (negotiable based on experience) and the chance to make a strategic impact in a dynamic, cloud-based environment.
What is in it for you
- Salary starting at $82,000 (negotiable based on experience).
- Annual bonus based on individual performance and company profitability, paid out in late fall.
- Permanent full-time position (40 hours/week), Monday to Friday, between 8 a.m. and 5 p.m.
- 3 weeks of vacation per year, depending on seniority.
- Comprehensive benefits package available after 90 days: dental and medical insurance, massage therapy, chiropractic care, and more.
- Retirement savings plan: voluntary contributions of up to 3% of salary, with a matching employer contribution.
Responsibilities
- Design, build, and maintain data pipelines, warehouses, and data models using Snowflake and DBT.
- Collaborate with cross-functional teams to gather data requirements and develop efficient data architectures.
- Implement and manage ETL/ELT processes across structured and unstructured data sources using tools such as Azure Data Factory and SQL.
- Enforce data governance protocols including quality, lineage, metadata management, and security compliance.
- Monitor system performance, conduct tuning, and proactively address bottlenecks.
- Maintain documentation of data processes, architecture, and technical specifications.
- Contribute to team knowledge by supporting peers and staying current on data engineering trends.
What you will need to succeed
- Bachelor's or graduate degree in computer engineering, data science, mathematics, or a related discipline.
- Relevant certifications in Azure Data Services or Snowflake are considered an asset.
- 4–6 years of experience in data engineering or a related field.
- Proficient in SQL and familiar with both relational and NoSQL databases (e.g., MS SQL Server, Snowflake, PostgreSQL, Cosmos DB).
- Hands-on experience with Snowflake and DBT for warehousing and data transformation.
- Skilled in designing and optimizing data pipelines and ETL/ELT workflows.
- Experience with cloud platforms, particularly Azure, and cloud-based storage systems.
- Familiarity with data pipeline and orchestration tools such as Azure Data Factory, Airflow, Azkaban, or Luigi.
- Experience leveraging REST APIs for data integration.
- Comfortable working in multidisciplinary teams to address complex data processing challenges.
- Fluency in English and French to support data governance, documentation, and collaboration across teams in both languages.