Product Support Engineer - Google Cloud Platform

Kount

Toronto

On-site

CAD 70,000 - 85,000

Full time

Today

Job summary

A technology company in Toronto is seeking a Product Support Engineer to manage their batch data delivery platform based on Google Cloud. The role involves configuring workflows, managing data pipelines, and ensuring data quality for clients. Candidates should hold a degree in computer science and possess strong analytical skills, along with familiarity with GCP and API tools. This position offers the opportunity to work in a collaborative environment focused on continuous improvement and client satisfaction.

Qualifications

  • Bachelor's degree in computer science, information technology or equivalent.
  • Strong analytical, detail-oriented, and proactive approach to problem solving.
  • Familiarity with data processing concepts and batch delivery systems.
  • Experience working with configuration files (e.g., YAML).
  • Understanding of cloud computing concepts, ideally with some exposure to GCP.

Responsibilities

  • Configure and execute data processing workflows on the batch delivery platform.
  • Deploy and manage complex data pipelines using Apache Airflow.
  • Monitor the execution of data processing jobs and ensure timely delivery.
  • Perform data quality checks to ensure integrity and accuracy.

Skills

Strong analytical skills
Detail-oriented
Proactive problem solving
Excellent communication skills

Education

Bachelor's degree in computer science or equivalent

Tools

GCP (Google Cloud Platform)
Apache Airflow
YAML
API testing tools
Python

Job description

Synopsis of the role

As a Product Support Engineer within the Technical Fulfillment team, you’ll configure, operate, and monitor our high-volume batch data delivery platform built on Google Cloud. You will play a key role in processing and enriching financial data for our clients, ensuring seamless and precise execution with a strong focus on quality and reliability, while gaining invaluable experience in a dynamic and evolving technological landscape.

What you will do

Platform Configuration and Operation:

  • Configure and execute data processing workflows on our batch delivery platform according to established procedures and client specifications.

  • Deploy and manage complex data pipelines orchestrated via Apache Airflow.

  • Leverage and extend Standard Airflow DAG Templates to ensure deployment consistency and rapid iteration across environments (a minimal illustrative sketch follows this list).

  • Monitor the execution of data processing jobs, identifying and addressing any issues to ensure timely and accurate delivery.

  • Perform data quality checks and validation to ensure the integrity and accuracy of processed information.

  • Diagnose and resolve complex configuration, scheduling, and dependency issues within the GCP platform and its associated infrastructure.

  • Adhere to established security protocols and maintain the highest standards of data security and privacy in all configuration activities.

  • Follow SDLC guidelines for any configuration deployments or changes.

  • Maintain accurate records of configurations, processes, and any issues encountered.

  • Stay up-to-date with platform updates, new features, and relevant technical information.

  • Collaborate effectively with other team members, including Configuration Specialists, and provide support as needed.
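
For illustration only, here is a minimal sketch of what a templated batch-delivery DAG driven by a per-client YAML configuration could look like. It assumes Airflow 2.x and PyYAML; the DAG name, config path, and config fields are hypothetical and not taken from this role.

    # Hypothetical sketch: the real Standard Airflow DAG Templates and client
    # configuration schema are internal and not described in this posting.
    from datetime import datetime

    import yaml
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    CONFIG_PATH = "/opt/configs/client_daily_batch.yaml"  # placeholder client config


    def load_client_config(path: str) -> dict:
        # Read the per-client YAML configuration that drives the batch run.
        with open(path) as fh:
            return yaml.safe_load(fh)


    def run_batch_delivery(**context):
        cfg = load_client_config(CONFIG_PATH)
        # Placeholder for the real enrichment and delivery step.
        print(f"Delivering {cfg['dataset']} to {cfg['destination']}")


    def check_data_quality(**context):
        cfg = load_client_config(CONFIG_PATH)
        # Placeholder quality gate, e.g. row counts or null checks on the output.
        print(f"Validating output for {cfg['dataset']}")


    with DAG(
        dag_id="client_daily_batch_delivery",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",            # Airflow 2.x scheduling parameter
        catchup=False,
        tags=["batch-delivery"],
    ) as dag:
        deliver = PythonOperator(task_id="run_batch_delivery",
                                 python_callable=run_batch_delivery)
        validate = PythonOperator(task_id="check_data_quality",
                                  python_callable=check_data_quality)
        deliver >> validate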

Continuous Improvement and Documentation:

  • Contribute to the identification of opportunities for process improvement and automation within the configuration workflows.

  • Assist in the creation and maintenance of our configuration documentation and knowledge base articles.

  • Participate in team meetings and contribute to discussions on process optimization and best practices.

Client Focus:

  • Understand the importance of accurate and timely data delivery for our clients.

  • Support the team in addressing client inquiries and resolving fulfillment-related issues.

  • Take ownership of precise, high-stakes data configurations that directly impact client trust and business outcomes.

  • Collaborate with cross-functional partners (including Engineering and Data Operations) to resolve issues and improve delivery processes.

What experience you need

  • Bachelor's degree in computer science, information technology or equivalent.

  • Strong analytical, detail-oriented, and proactive approach to problem solving.

  • Familiarity with data processing concepts and batch delivery systems.

  • Experience working with configuration files (e.g., YAML).

  • Understanding of cloud computing concepts, ideally with some exposure to GCP.

  • Familiarity with different file encoding formats.

  • Experience with API tools for testing and understanding data interfaces (see the illustrative check after this list).

  • Basic scripting skills (e.g., shell scripting) are an asset.

  • Excellent written and verbal communication skills in English.

  • Ability to follow team procedures accurately and manage multiple priorities effectively.

  • Ability to work effectively both independently and as part of a collaborative team.
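
As a hedged illustration of basic API and scripting skills, the short check below probes a hypothetical delivery-status endpoint with Python's requests library; the URL, expected fields, and token are placeholders, not a real interface from this role.

    # Hypothetical example: endpoint, expected fields, and token are placeholders.
    import requests

    API_URL = "https://api.example.com/v1/deliveries/latest"  # placeholder endpoint
    EXPECTED_FIELDS = {"delivery_id", "record_count", "status"}


    def check_delivery_endpoint(token: str) -> None:
        # Call the endpoint and fail fast on HTTP errors.
        resp = requests.get(API_URL,
                            headers={"Authorization": f"Bearer {token}"},
                            timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        # Verify the response carries the fields a downstream job would rely on.
        missing = EXPECTED_FIELDS - payload.keys()
        if missing:
            raise ValueError(f"Response missing expected fields: {missing}")
        print(f"Delivery {payload['delivery_id']} status: {payload['status']}")


    if __name__ == "__main__":
        check_delivery_endpoint(token="redacted")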

What could set you apart

  • Ability to verify issues by testing in higher environments and to provide on-call support for production incidents as needed.

  • Experience with GCP services such as Dataflow, Composer, Cloud Storage, or BigQuery (a small BigQuery sketch follows this list).

  • Familiarity with monitoring and logging tools within a cloud environment (e.g., Google Cloud Logging and Monitoring).

  • Exposure to financial services data and an understanding of its sensitivity and importance.

  • Proficiency in scripting (Python) or object-oriented programming languages (Java).
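
For candidates curious what light GCP exposure can look like in practice, here is a minimal sketch using the google-cloud-bigquery client library to count rows delivered today; the project, dataset, and table names are placeholders invented for this example.

    # Hypothetical sketch: project, dataset, and table names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")  # placeholder project

    query = """
        SELECT COUNT(*) AS row_count
        FROM `my-gcp-project.batch_delivery.daily_output`
        WHERE delivery_date = CURRENT_DATE()
    """

    # Run the query and print the single-row result.
    for row in client.query(query).result():
        print(f"Rows delivered today: {row.row_count}")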

Primary Location: CAN-Toronto-5700 Yonge; CAN-Montreal

Function: Tech Dev and Client Services

Schedule: Full time
