
Data Engineer - 2053

JorDan HR

Pretoria

On-site

ZAR 300,000 - 400,000

Full time

Yesterday

Job summary

A reputable global company based in Gauteng is seeking a skilled Data Engineer. The ideal candidate will possess a degree and be a Certified AWS Cloud Practitioner. Key skills include proficiency in Terraform, Python, SQL, and experience with data formats and APIs. This full-time role involves designing and validating data processes to ensure accuracy and efficiency. Interested candidates should send their CV for consideration.

Qualifications

  • Familiarity with data formats like Parquet, AVRO, JSON, XML, CSV.
  • Strong analytical skills for analysing large and complex data sets.
  • Experience building data pipelines using AWS Glue or similar.

Responsibilities

  • Design, code, test, and debug programs based on specifications.
  • Perform thorough testing and data validation to ensure data accuracy.
  • Document processes with precise written communication.

Skills

Terraform
Python 3.x
SQL - Oracle / PostgreSQL
PySpark
Boto3
Docker
Linux / Unix
Big Data
PowerShell / Bash

Education

Degree
Certified AWS Cloud Practitioner or similar

Job description

Overview

A Data Engineer with a degree, a Certified AWS Cloud Practitioner (or similar) certification, knowledge of data formats such as Parquet, AVRO, JSON, XML, and CSV, and experience with REST APIs is required for a reputable global company based in Gauteng.

Technical Skills / Technology
  • Terraform
  • Python 3.x
  • SQL - Oracle / PostgreSQL
  • PySpark
  • Boto3
  • Docker
  • Linux / Unix
  • Big Data
  • PowerShell / Bash
  • Technical Data Modelling and schema design
  • Demonstrated expertise in data modelling and Oracle SQL
  • Exceptional analytical skills for analysing large and complex data sets
  • Perform thorough testing and data validation to ensure the accuracy of data transformations
  • Strong written and verbal communication skills with precise documentation
  • Self-driven team player with the ability to work independently and multi-task
  • Experience building data pipelines using AWS Glue, Data Pipeline, or similar platforms
  • Familiarity with data stores such as AWS S3, AWS RDS, or DynamoDB
  • Experience with and a solid understanding of various software design patterns
  • Experience preparing specifications from which programs will be written, designed, coded, tested, and debugged

Send CV to: [email protected]

If you have not received a reply within 7 days, please consider your application unsuccessful.

Key Skills

Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala

Employment Details

Employment Type : Full-Time

Experience : years

Vacancy : 1
