Job Search and Career Advice Platform

Full-Stack Data Engineer (SA Remote)

Niva Health

Remote

ZAR 300 000 - 400 000

Full time

2 days ago

Job summary

A health-tech company in South Africa is seeking a Data Engineer to build and maintain data solutions that enhance reporting and decision-making. The role requires expertise in Google Cloud Platform, particularly BigQuery, as well as proficiency in Python and SQL for data processing. You will design data pipelines, automate workflows, and create dashboards using Looker Studio. Candidates with 2+ years of experience in data engineering or related fields are encouraged to apply. The salary is competitive and aligned to South African market rates.

Qualifications

  • 2+ years of experience in data engineering, analytics engineering, data science, or software engineering.
  • Confident in Python for data processing and automation.
  • Solid SQL skills and understanding of data modelling basics.

Responsibilities

  • Design, build, and maintain data solutions for reporting and decision-making.
  • Build and maintain data pipelines using Google Cloud Platform.
  • Automate ETL/ELT workflows to improve reliability and efficiency.

Skills

Data engineering experience
GCP (Google Cloud Platform)
Python
SQL
Data pipeline maintenance
Looker Studio

Tools

Google Cloud Platform
Looker Studio
Apache Airflow

Job description

This role is for you if you enjoy building real data solutions end-to-end — not just one piece of the puzzle.

What you'll be working on

You'll help design, build, and maintain data solutions that power reporting and decision-making across the business.

That includes:

  • Building and maintaining data pipelines using Google Cloud Platform (BigQuery, Cloud Functions, Cloud Composer, Cloud Scheduler).
  • Cleaning, transforming, and organising data from multiple sources (APIs, spreadsheets, internal systems).
  • Automating ETL/ELT workflows to improve reliability and efficiency.
  • Writing Python (and some Bash) scripts to support data processing and internal tools.
  • Building and maintaining dashboards and KPI reports using Looker Studio (and supporting data visualisation needs).
  • Preparing datasets for simple predictive or forecasting use cases as the team evolves.
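As an illustrative sketch of the cleaning-and-transforming work described above (all names, columns, and helpers here are hypothetical, not taken from the posting), a small Python step that normalises a raw CSV export before loading it into a warehouse table might look like:

```python
import csv
import io

def clean_rows(raw_csv: str) -> list[dict]:
    """Normalise a raw CSV export (a hypothetical example): lower-case and
    strip header names, trim whitespace from values, drop fully blank rows,
    and coerce an assumed 'amount' column to float."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        # Strip stray whitespace from both header names and values.
        cleaned = {k.strip().lower(): (v or "").strip() for k, v in row.items()}
        if not any(cleaned.values()):
            continue  # skip blank lines that spreadsheets often export
        if "amount" in cleaned:
            cleaned["amount"] = float(cleaned["amount"] or 0)
        rows.append(cleaned)
    return rows

raw = "Name ,Amount\nAlice , 10.5\n , \nBob,3\n"
print(clean_rows(raw))
# → [{'name': 'Alice', 'amount': 10.5}, {'name': 'Bob', 'amount': 3.0}]
```

In practice a step like this would sit inside a pipeline (e.g. a Cloud Function or Composer task) that writes the cleaned rows to BigQuery; the snippet only shows the transform stage.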

This is a hands-on role: you'll be writing code, fixing issues, improving pipelines, and seeing your work used by real teams.

You'll be a great fit if

  • You have 2+ years' experience in data engineering, analytics engineering, data science, or software engineering.
  • You're comfortable working with GCP, especially BigQuery.
  • You use Python confidently for data processing and automation.
  • You have solid SQL skills and understand data modelling basics.
  • You've built or maintained data pipelines before (batch or streaming).
  • You've worked with dashboards or BI tools (Looker / Looker Studio preferred).
  • You enjoy working across both technical backend tasks and user-facing reporting.

Nice to have

  • Experience with Apache Airflow / Cloud Composer.
  • Exposure to Apache Beam.
  • Familiarity with Vertex AI, AutoML, or basic ML workflows.
  • Experience supporting operational or healthcare data.

Salary

Competitive and aligned to South African remote market rates.

Final offer will depend on experience and technical depth.
