Google Cloud Developer

PBT Group

Cape Town

On-site

ZAR 500,000 - 700,000

Full time

Job summary

A leading technology company in Cape Town is seeking a Google Cloud Developer to design and maintain data pipelines on GCP. The ideal candidate will have a Bachelor's Degree, over 3 years of experience in Data Engineering, strong skills in SQL and Python, and familiarity with GCP technologies. Join a dynamic team and shape the future of data solutions in the cloud.

Benefits

Work on cutting-edge GCP technologies
Collaborative team environment
Focus on innovation and growth

Qualifications

  • 3+ years of hands-on Data Engineering experience, preferably on GCP.
  • Strong skills in SQL and Python for data transformation and automation.
  • Bonus points for experience with Kafka/streaming and with DevOps tooling such as Docker and Terraform.

Responsibilities

  • Design, build, and maintain scalable data pipelines and ETL processes on GCP.
  • Manage and optimise data warehouses and lakes.
  • Develop and enhance data models for analytics, reporting, and machine learning.

Skills

SQL
Python
Data modelling
ETL processes

Education

Bachelor's Degree in Computer Science, Information Systems, Engineering, or related field

Tools

BigQuery
Cloud Dataflow
Cloud Storage
Cloud Composer

Job description

Google Cloud Developer required in Cape Town.

Are you passionate about building powerful data pipelines, optimising architectures, and enabling advanced analytics and machine learning? We’re looking for an experienced GCP Data Engineer to join our dynamic team and help shape the future of data in the cloud.

In this role, you’ll work closely with data analysts, data scientists, and business stakeholders to design and deliver scalable, reliable, and secure data solutions on Google Cloud Platform.

If you love solving complex data challenges and want to make an impact in a fast-paced, innovative environment, this is the role for you!

What You’ll Do:

  • Design, build, and maintain scalable data pipelines and ETL processes on GCP.
  • Manage and optimise data warehouses and lakes using tools like BigQuery, Cloud Dataflow, Cloud Storage, and Cloud Composer.
  • Develop and enhance data models that power analytics, reporting, and machine learning.
  • Partner with cross-functional teams to turn business needs into tailored data solutions.
  • Conduct performance tuning and data quality checks, and apply security best practices.
  • Document and maintain data architectures and processes for long-term success.

What We’re Looking For:

  • Bachelor’s Degree in Computer Science, Information Systems, Engineering, or related field.
  • 3+ years of hands-on Data Engineering experience (preferably on GCP).
  • Strong skills in SQL and Python for data transformation and automation.
  • Expertise with BigQuery, Cloud Dataflow, Cloud Storage, Cloud Composer, Pub/Sub, and Cloud Functions.
  • Solid knowledge of data modelling, ETL, and data warehousing.
  • Bonus points for: experience with Kafka/streaming, DevOps tooling (Docker, Terraform, Kubernetes), or GCP certifications.

What Sets You Apart:

  • Analytical mindset with strong problem-solving skills.
  • Keen eye for data accuracy, reliability, and security.
  • Strong communicator who thrives in collaborative environments.
  • Self-driven, adaptable, and always ready to learn new tech.

Why Join Us?

  • Work on cutting-edge GCP technologies in a forward-thinking organisation.
  • Collaborate with a team that values innovation, impact, and growth.
  • Play a key role in shaping our data strategy and supporting machine learning initiatives.
  • Be part of a culture that celebrates curiosity, collaboration, and success.