Senior Cloud Data Warehouse Engineer

ALLTECH CONSULTING SVC INC

Quebec

On-site

CAD 90,000 - 130,000

Full time

Yesterday

Job summary

A leading company in technology consulting is seeking a Senior Cloud Data Warehouse Engineer to join their C3 Data Warehouse team. The role involves building a next-gen data platform utilizing Snowflake and Python, focusing on data sourcing and storage for reporting and analytics. The ideal candidate will have extensive experience in data solutions, strong analytical skills, and a collaborative approach to problem-solving.

Qualifications

  • Minimum 10 years of experience in data development.
  • At least 7 years of SQL / PL/SQL experience.
  • Snowflake SnowPro Core certification required.

Responsibilities

  • Design, develop, and manage the Snowflake data warehouse.
  • Establish best practices for Snowflake usage.
  • Monitor query and data load performance.

Skills

SQL
Python
Data Warehousing
Snowflake
Data Pipeline Development
Performance Tuning
Collaboration
Problem Solving

Education

Bachelor’s degree in Computer Science

Tools

Snowflake
Airflow
DBT
Spark

Job description

Team Overview:

The Controls Engineering, Measurement and Analytics (CEMA) department is responsible for Cyber Risk and Control assessment, management, monitoring, and reporting capabilities across Technology, resulting in risk reduction and better oversight of the firm's technology risk landscape. Our work is always client-focused, and our engineers are problem-solvers and innovators. We seek exceptional technologists to help deliver solutions on our user-facing applications, data stores, and reporting and metric platforms while being cloud-centric, leveraging multi-tier architectures, and aligning with our DevOps and Agile strategies.

We are modernizing our technology stack across multiple platforms to build scalable, front-to-back assessment, measurement, and monitoring systems using the latest cloud, web, and data technologies. We are looking for someone with a systematic problem-solving approach, coupled with a sense of ownership and drive. The successful candidate will be able to influence and collaborate globally, be a strong team player, have an entrepreneurial approach, push innovative ideas while considering risk, and adapt in a fast-paced, changing environment.

Role Summary:

As a Senior Cloud Data Warehouse Engineer, you will be a member of the C3 Data Warehouse team, focused on building our next-gen data platform that sources and stores data from various technology systems across the firm in a centralized location. This platform will power reporting and analytics solutions for the Technology Risk functions within the Company. You will primarily contribute to developing our Cloud Data Warehouse using Snowflake and Python-based tooling. Responsibilities include designing and developing the data warehouse using Snowflake features such as data sharing, time travel, Snowpark, and workload optimization, and handling both structured and unstructured data. You will also work on integrating the Snowflake data warehouse with internal platforms for data quality, cataloging, discovery, incident logging, and metric generation. Collaboration with data warehousing leads, data analysts, ETL developers, infrastructure engineers, and data analytics teams is essential to implementing this data platform and pipeline framework.

KEY RESPONSIBILITIES:
  1. Design, develop, and manage our Snowflake data warehouse.
  2. Establish best practices for optimal and efficient Snowflake usage with tools like Airflow, DBT, and Spark.
  3. Assist with testing and deploying data pipeline frameworks using standard testing frameworks and CI/CD tools.
  4. Monitor query and data load performance, performing tuning as necessary.
  5. Provide assistance during QA & UAT phases to confirm the validity of reported issues, determine root causes, and resolve them effectively.
SKILLS / QUALIFICATIONS:
  • Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or related field.
  • Minimum 10 years of experience in data development and solutions in complex data environments with large data volumes.
  • At least 7 years of SQL / PL/SQL experience with the ability to write complex queries for data analysis.
  • At least 5 years of experience with Snowflake data solutions.
  • At least 3 years of experience developing data pipelines and warehousing solutions using Python and libraries such as Pandas, NumPy, and PySpark.
  • Experience in hybrid data environments (on-premises and cloud).
  • Hands-on Python experience is essential.
  • Hands-on experience with Airflow or similar tools like Dagster.
  • Snowflake SnowPro Core certification is required.
  • SnowPro Advanced Architect and Data Engineer certifications are a plus.
  • Experience with DBT is advantageous.
  • Skills in performance tuning SQL queries, Spark jobs, and stored procedures.
  • Understanding of E-R data models and advanced data warehouse concepts.
  • Strong analytical, communication, and collaboration skills.
  • Proven ability to manage multiple projects independently in a dynamic environment.
  • Excellent problem-solving skills to clarify business objectives and requirements.