Expert Data Engineer

Sabenza IT & Recruitment

Pretoria

On-site

ZAR 500 000 - 800 000

Full time

30+ days ago

Job summary

A leading recruitment company seeks a Data Engineer to manage extensive data provisioning and governance. The successful candidate will build and maintain robust Big Data pipelines on cloud platforms, ensuring compliance and security in data sharing operations. Required skills include Terraform, Python, AWS, and SQL, with a degree in IT or Engineering preferred. This position offers opportunities for innovation and mentorship in a collaborative environment.

Qualifications

  • Demonstrated experience developing technical documentation and artefacts.
  • Experience with enterprise collaboration tools.

Responsibilities

  • Building and maintaining large-scale Big Data pipelines on cloud-based data platforms.
  • Ensuring secure and compliant data sharing aligned with information classification standards.
  • Supporting enterprise Data & Analytics initiatives.

Skills

Terraform
Python 3.x
Docker
SQL (Oracle / PostgreSQL)
AWS

Education

Relevant IT, Business, or Engineering Degree

Tools

AWS Glue
Kafka
AWS EMR

Job Description

About the Role:

The Data Engineer will work on enterprise-wide data provisioning, spanning multiple data governance domains and data assets. Responsibilities include ensuring secure data sharing, adhering to protection and compliance requirements, supporting enterprise Data & Analytics initiatives (including high-priority use cases), and enabling data provisioning for operational processes.

Role Responsibilities:

Data Engineers in this environment are custodians of critical data assets and pipelines. Responsibilities include:

  • Building and maintaining large-scale Big Data pipelines on cloud-based data platforms
  • Ensuring secure and compliant data sharing aligned with information classification standards
  • Supporting enterprise Data & Analytics initiatives and high-priority use cases
  • Continuously improving and automating data engineering processes
  • Evaluating emerging tools and technologies to drive innovation
  • Mentoring and upskilling team members
  • Maintaining high-quality technical documentation

Requirements

Essential Skills:

Candidates must demonstrate strong, above-average expertise in:

Cloud & Infrastructure

  • Terraform
  • Docker
  • Linux / Unix
  • CloudFormation
  • CodeBuild / CodePipeline
  • CloudWatch
  • SNS
  • S3
  • Kinesis (Data Streams, Firehose)
  • Lambda
  • DynamoDB
  • Step Functions
  • Parameter Store
  • Secrets Manager
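
For illustration only (not part of the advertised requirements), a minimal Python sketch of how several of these services typically meet in one place: a Lambda handler that decodes Kinesis records and persists them to DynamoDB. The table name and payload shape are assumptions, not details from this posting.

    import base64
    import json

    import boto3

    # Hypothetical table name; in practice it would come from
    # Parameter Store or an environment variable, not a literal.
    table = boto3.resource("dynamodb").Table("events")

    def handler(event, context):
        """Lambda entry point for a Kinesis event source mapping."""
        for record in event["Records"]:
            # Kinesis delivers each payload base64-encoded.
            payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
            table.put_item(Item=payload)
        return {"processed": len(event["Records"])}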

Programming & Data Engineering

  • Python 3.x
  • SQL (Oracle / PostgreSQL)
  • PySpark
  • Boto3
  • ETL development
  • Big Data platforms
  • PowerShell / Bash
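
Again purely as an illustration, a short PySpark sketch of the ETL pattern these items point at: read raw data from S3, transform it, and write a curated, partitioned result back. The bucket names and columns are hypothetical.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Read the raw zone (hypothetical bucket and layout).
    orders = spark.read.parquet("s3://raw-zone/orders/")

    # A typical transform: filter, derive a date, aggregate.
    daily_totals = (
        orders.filter(F.col("status") == "COMPLETE")
              .withColumn("order_date", F.to_date("created_at"))
              .groupBy("order_date")
              .agg(F.sum("amount").alias("total_amount"))
    )

    # Write back partitioned, so Glue/Athena can prune by date.
    (daily_totals.write.mode("overwrite")
                 .partitionBy("order_date")
                 .parquet("s3://curated-zone/daily_order_totals/"))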

Data Platforms & Tools

  • Glue
  • Athena
  • Technical data modelling & schema design (hands-on, not drag-and-drop)
  • Kafka
  • AWS EMR
  • Redshift
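
A corresponding sketch, again with assumed names, of querying such a curated table through Athena with boto3; the database, table, and results bucket are hypothetical, and the table is assumed to be registered in the Glue Data Catalog.

    import boto3

    athena = boto3.client("athena")

    response = athena.start_query_execution(
        QueryString=(
            "SELECT order_date, total_amount "
            "FROM daily_order_totals LIMIT 10"
        ),
        QueryExecutionContext={"Database": "curated"},
        ResultConfiguration={"OutputLocation": "s3://query-results-bucket/athena/"},
    )
    # Athena runs asynchronously; poll get_query_execution with this id.
    print(response["QueryExecutionId"])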

Business & Analytics

  • Business Intelligence (BI) experience
  • Strong data governance and security understanding

Advantageous Skills:

  • Advanced data modelling expertise, especially in Oracle SQL
  • Strong analytical skills for large, complex datasets
  • Experience with testing, data validation, and transformation accuracy
  • Excellent documentation, written, and verbal communication skills
  • Ability to work independently, multitask, and collaborate within teams
  • Experience building data pipelines using AWS Glue, Data Pipeline, or similar
  • Familiarity with AWS S3, RDS, and DynamoDB
  • Solid understanding of software design patterns
  • Experience preparing technical specifications, designing, coding, testing, and debugging solutions
  • Strong organisational abilities
  • Knowledge of Parquet, AVRO, JSON, XML, CSV
  • Experience with Data Quality tools such as Great Expectations (an example of the kind of checks such tools automate follows this list)
  • Experience working with REST APIs
  • Basic networking knowledge and troubleshooting skills
  • Understanding of Agile methodologies
  • Experience with documentation tools such as Confluence and JIRA
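
To make the data-quality point concrete, a minimal hand-rolled sketch of the checks a tool such as Great Expectations automates, written in plain pandas; the column names and allowed statuses are invented for the example.

    import pandas as pd

    def validate_orders(df: pd.DataFrame) -> list:
        """Return human-readable validation failures (empty list = pass)."""
        failures = []
        if df["order_id"].isna().any():
            failures.append("order_id contains nulls")
        if df["order_id"].duplicated().any():
            failures.append("order_id is not unique")
        if (df["amount"] < 0).any():
            failures.append("amount contains negative values")
        if not df["status"].isin(["PENDING", "COMPLETE", "CANCELLED"]).all():
            failures.append("status outside the expected set")
        return failures

A dedicated Data Quality tool layers declarative expectation suites, reporting, and scheduling on top of checks like these.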

Qualifications & Experience

  • Relevant IT, Business, or Engineering Degree
  • Experience developing technical documentation and artefacts
  • Experience with enterprise collaboration tools

Preferred Certifications

  • AWS Cloud Practitioner
  • AWS SysOps Associate
  • AWS Developer Associate
  • AWS Architect Associate
  • AWS Architect Professional
  • HashiCorp Terraform Associate

