GCP Cloud Engineer

Compunnel, Inc.

Minneapolis (MN)

On-site

USD 90,000 - 130,000

Full time

30+ days ago

Job summary

A leading company is seeking a GCP Cloud Engineer to design, develop, and maintain scalable data pipelines using Google Cloud services. The ideal candidate will have strong problem-solving skills, adaptability, and experience with data engineering tools such as Python, SQL, and PySpark. The role involves close collaboration with cross-functional teams to ensure high-quality data products and support data warehousing initiatives. If you're eager to learn and excel in a dynamic environment, this position offers a great opportunity for professional growth.

Qualifications

  • Hands-on experience with Google BigQuery, Dataflow, and Dataplex.
  • Proficiency in Python, SQL, and PySpark.
  • Strong problem-solving, analytical, and communication skills.

Responsibilities

  • Design, develop, and maintain data pipelines using GCP services.
  • Collaborate with senior data engineers to implement cloud solutions.
  • Document technical processes and maintain organized documentation.

Skills

Problem-solving
Communication
Analytical skills
Adaptability

Tools

Python
SQL
PySpark
SnapLogic
Google BigQuery
Google Dataflow
Google Dataplex
Jira

Job description

We are seeking a GCP Cloud Engineer with hands-on experience in Google Cloud Platform services and modern data engineering tools.

The ideal candidate will be responsible for building and maintaining scalable data pipelines, supporting cloud-based data solutions, and collaborating with cross-functional teams to deliver high-quality data products.

A strong desire to learn, problem-solving skills, and adaptability are essential for success in this role.

Key Responsibilities

  • Design, develop, and maintain data pipelines using GCP services such as Cloud Composer, BigQuery, Dataflow, and Dataplex (a minimal sketch follows this list).
  • Write efficient and scalable code using Python, SQL, and PySpark.
  • Integrate and manage data workflows using SnapLogic and, as a nice-to-have, DataStage.
  • Collaborate with senior data engineers and project teams to implement cloud-based data solutions.
  • Support data warehousing initiatives and ensure data quality and integrity.
  • Participate in Agile development processes and contribute to sprint planning and task tracking using Jira.
  • Document technical processes and maintain clear, organized project documentation.
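
To give a concrete sense of the first responsibility above, here is a minimal sketch of a Cloud Composer (managed Airflow) DAG that runs a scheduled BigQuery job. This is an illustration only: the DAG, dataset, and table names are hypothetical, and it assumes Airflow 2.4+ with the Google provider package installed, as it is on Cloud Composer.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

# Hypothetical daily pipeline: rebuilds an aggregate table in BigQuery.
with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rebuild_daily_sales",
        configuration={
            "query": {
                # Dataset and table names are placeholders.
                "query": """
                    CREATE OR REPLACE TABLE analytics.daily_sales AS
                    SELECT order_date, SUM(amount) AS total_amount
                    FROM raw.orders
                    GROUP BY order_date
                """,
                "useLegacySql": False,
            }
        },
    )

In practice a DAG like this would chain extraction, validation, and load tasks; the sketch shows only the scheduling and the BigQuery hand-off.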

Required Qualifications

  • Hands-on experience with:
      • Google BigQuery
      • Google Dataflow
      • Google Dataplex
      • Python, SQL, and PySpark (see the PySpark sketch after this list)
      • SnapLogic
  • Strong problem-solving and analytical skills.
  • Ability to take direction from management and senior team members.
  • Excellent documentation and communication skills.
  • Strong desire and ability to learn new technologies and tools.
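
As an illustration of the Python/SQL/PySpark combination above, here is a minimal PySpark sketch that reads a BigQuery table, aggregates it, and writes the result back. It assumes the spark-bigquery connector is on the classpath (the default on Dataproc clusters); all table and bucket names are placeholders.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-sales-rollup").getOrCreate()

# Read a source table through the spark-bigquery connector (placeholder names).
orders = spark.read.format("bigquery").option("table", "raw.orders").load()

# Aggregate order amounts per day.
daily = orders.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))

# Writing to BigQuery requires a staging bucket for the connector's temporary files.
(daily.write.format("bigquery")
    .option("table", "analytics.daily_sales")
    .option("temporaryGcsBucket", "example-staging-bucket")
    .mode("overwrite")
    .save())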