2063 AWS Data Engineer

Jordan Hr

Gauteng

Hybrid

ZAR 300,000 - 600,000

Full time

30+ days ago

Job summary

An established industry player is seeking a talented Data Engineer to join their dynamic team. This role involves building and maintaining Big Data pipelines, ensuring data integrity, and exploring innovative data engineering approaches. You'll have the chance to work with cutting-edge technologies and mentor team members, all while enjoying a flexible working environment that promotes work-life balance. If you're passionate about data and eager to drive efficiency through automation, this opportunity is perfect for you. Join a forward-thinking team that values your contributions and fosters professional growth.

Benefits

Flexible working hours
High work-life balance
Remote / On-site work flexibility
Affordable group vehicle promotions
Modern, state-of-the-art offices
Dynamic global team collaboration
Application of Agile Working Model

Qualifications

  • Experience in building and maintaining Big Data pipelines.
  • Strong analytical skills and ability to work with large datasets.

Responsibilities

  • Build and maintain data pipelines using various data platforms.
  • Ensure data is shared according to classification requirements.

Skills

Terraform
Python
SQL - Oracle/PostgreSQL
PySpark
Boto3
ETL
Docker
Linux / Unix
Big Data
PowerShell / Bash

Education

Relevant IT / Business / Engineering Degree

Tools

AWS Glue
AWS RDS
AWS S3
Kafka
AWS EMR
Redshift
Confluence
JIRA

Job description

Product / Feature Team Information

Our domain, Enterprise Data Services / Platforms, comprises a number of Data Business Objects: Quality, Motorbike, Purchasing & Supplier, Marketing & Research, Finance, Sales, Customer, Customer Support, ITO, and OTD.

Essential Skills Requirements

Above average experience / understanding (in order of importance):

  • Technical Skills / Technology: Terraform, Python 3.x, SQL (Oracle / PostgreSQL), PySpark, Boto3, ETL, Docker, Linux / Unix, Big Data, PowerShell / Bash.
  • Basic understanding of AWS components (in order of importance): Glue, CloudWatch, SNS, Athena, S3, Kinesis Streams (Kinesis, Kinesis Firehose), Lambda, DynamoDB, Step Functions, Parameter Store, Secrets Manager, CodeBuild / CodePipeline, CloudFormation.
  • Business Intelligence (BI) experience.
  • Technical data modelling and schema design.
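The requirements above centre on building ETL pipelines in Python. As a rough illustration of the extract-transform-load pattern the role involves, here is a minimal, self-contained sketch in plain Python; in practice the same steps would run on PySpark / AWS Glue against S3 and Redshift, and all names and data here are hypothetical:

```python
# Illustrative ETL sketch only: extract, transform, and load stages
# expressed with Python stdlib. All names and sample data are hypothetical.
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into rows (stand-in for reading source files from S3)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Normalise values and drop malformed rows (stand-in for a PySpark/Glue job)."""
    return [
        {"sku": r["sku"].strip().upper(), "qty": int(r["qty"])}
        for r in rows
        if r["qty"].strip().isdigit()  # data-integrity check: keep numeric quantities only
    ]


def load(rows: list[dict]) -> dict[str, int]:
    """Aggregate rows into a target store (stand-in for a Redshift/RDS table)."""
    out: dict[str, int] = {}
    for r in rows:
        out[r["sku"]] = out.get(r["sku"], 0) + r["qty"]
    return out


raw = "sku,qty\n a1 ,3\nA1,2\nb7,oops\n"
result = load(transform(extract(raw)))  # {"A1": 5}; the malformed "b7" row is dropped
```

Each stage is a pure function over rows, which keeps the pipeline easy to test and to re-run idempotently, the same properties the listed tooling (Glue, Step Functions) provides at scale.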