
GCP Data Engineer (Snowflake)

TELUS International

United States

Remote

USD 100,000 - 130,000

Full time

Today

Job summary

A leading technology service provider in the United States is seeking a skilled Data Engineer to design and optimize data pipelines on GCP. The ideal candidate has 5+ years of experience with GCP core data services and strong Python skills. Responsibilities include collaborating with cross-functional teams to deliver data solutions and maintaining high-quality ETL processes. This role is a perfect fit for those looking to work in a dynamic, cross-functional environment.

Qualifications

  • 5+ years of experience with GCP core data services: BigQuery, Dataflow, Dataproc and Pub/Sub.
  • Proficient in data engineering development using Python.
  • Strong skills in debugging complex issues within data pipelines.

Responsibilities

  • Design and optimize data pipelines and ETL workflows using GCP.
  • Write, test, and maintain high-quality Python code for ETL.
  • Collaborate with cross-functional teams to deliver data solutions.

Skills

Python
SQL
GCP Services
Data Engineering
Git
Pulumi
Job description

Mandatory skills: Advanced Python and SQL; GCP services: BigQuery, Dataflow, Dataproc and Pub/Sub.

Key Responsibilities
  • Design, develop and optimize scalable data pipelines and ETL workflows using Google Cloud Platform (GCP), particularly leveraging BigQuery, Dataflow, Dataproc and Pub/Sub.
  • Design and manage secure, efficient data integrations involving Snowflake and BigQuery.
  • Write, test and maintain high-quality Python code for data extraction, transformation and loading (ETL), analytics and automation tasks (see the illustrative sketch after this list).
  • Use Git for collaborative version control, code reviews and managing data engineering projects.
  • Implement infrastructure-as-code practices using Pulumi for cloud resources management and automation within GCP environments.
  • Apply clean room techniques to design and maintain secure data sharing environments in alignment with privacy standards and client requirements.
  • Collaborate with cross-functional teams (data scientists, business analysts, product teams) to deliver data solutions, troubleshoot issues and assure data integrity throughout the lifecycle.
  • Optimize performance of batch and streaming data pipelines, ensuring reliability and scalability.
  • Maintain documentation on processes, data flows and configurations for operational transparency.
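For illustration only, here is a minimal sketch of the kind of streaming ETL work described above: a Python Apache Beam pipeline (runnable on Dataflow) that reads JSON events from a Pub/Sub subscription, applies a small transform, and appends rows to a BigQuery table. The project, subscription, bucket and table names are placeholders, not details from this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    # Decode a raw Pub/Sub message into a flat dict matching the BigQuery schema.
    event = json.loads(message.decode("utf-8"))
    return {
        "user_id": event.get("user_id"),
        "event_type": event.get("event_type"),
        "event_ts": event.get("timestamp"),
    }


def run() -> None:
    # Placeholder project, region, bucket and runner settings.
    options = PipelineOptions(
        project="example-project",
        region="us-east1",
        temp_location="gs://example-bucket/tmp",
        streaming=True,
        runner="DataflowRunner",
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub"
            )
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
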
Required Skills
  • Strong hands-on experience of 5+ years with GCP core data services: BigQuery, Dataflow, Dataproc and Pub/Sub.
  • Proficiency in data engineering development using Python.
  • Deep familiarity with Snowflake data modeling, secure data sharing and advanced query optimization.
  • Proven experience with Git for source code management and collaborative development.
  • Demonstrated ability using Pulumi (or similar IaC tools) for deployment and support of cloud infrastructure (see the illustrative sketch after this list).
  • Practical understanding of cleanroom concepts in cloud data warehousing, including privacy/compliance considerations.
  • Solid skills in debugging complex issues within data pipelines and cloud environments.
  • Effective communication and documentation skills.
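As a companion to the skill list above, here is a minimal Pulumi sketch in Python of the infrastructure-as-code practice this role calls for: declaring a BigQuery dataset and a Pub/Sub topic as code so they can be reviewed in Git and deployed repeatably. Resource names and labels are placeholders, not details from this posting.

import pulumi
import pulumi_gcp as gcp

# BigQuery dataset that the ETL pipelines load into (placeholder names).
analytics_dataset = gcp.bigquery.Dataset(
    "analytics-dataset",
    dataset_id="analytics",
    location="US",
    labels={"team": "data-engineering"},
)

# Pub/Sub topic feeding the streaming pipeline.
events_topic = gcp.pubsub.Topic("events-topic")

# Export identifiers so other stacks or pipeline configs can reference them.
pulumi.export("dataset_id", analytics_dataset.dataset_id)
pulumi.export("events_topic_name", events_topic.name)
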
Great to Have
  • GCP certification (e.g., Professional Data Engineer).
  • Experience working in regulated environments (telecom/financial/healthcare) with data privacy and compliance focus.
  • Exposure to additional GCP services such as Cloud Storage, Cloud Functions or Kubernetes.
  • Demonstrated success collaborating in agile, distributed teams.
  • Experience with data visualization tools (e.g., Tableau, Looker).
  • EST Shift