AWS Data Engineer (Ref. 2063)

JorDan HR

Centurion

Hybrid

ZAR 600,000 - 900,000

Full time

21 days ago

Job summary

A leading company in the automobile industry is seeking a Senior AWS Data Engineer to build and maintain Big Data Pipelines. The ideal candidate will have a strong background in AWS technologies, data engineering, and relevant qualifications. This position offers flexible working hours and a dynamic team environment, making it an exciting opportunity for professionals looking to grow in their careers.

Benefits

Flexible working hours
High work-life balance
Remote/on-site flexibility
Modern offices
Dynamic team environment
Application of Agile methodologies

Qualifications

  • Experience with AWS components like Glue, CloudWatch, Lambda, and DynamoDB.
  • Ability to develop technical documentation and knowledge of data formats.
  • Experience with building data pipelines and data validation.

Responsibilities

  • Build and maintain Big Data Pipelines.
  • Ensure data sharing aligns with classification requirements.
  • Identify process improvements and mentor team members.

Skills

Terraform
Python 3.x
PySpark
Boto3
Docker
PowerShell / Bash
Data quality tools
REST APIs

Education

Relevant IT, Business, or Engineering Degree
AWS Certified Cloud Practitioner
HashiCorp Terraform Associate

Tools

AWS Glue
AWS EMR
Redshift
Confluence
JIRA

Job description

A client in the automobile industry is seeking a Senior AWS Data Engineer.

Product / Feature Team Information (if applicable)

Our domain's Enterprise Data Services Platforms cover various Data Business Objects such as Quality, Motorbike, Purchasing, Supplier, Marketing & Research, Finance, Sales, Customer Support, ITO, and OTD.

ESSENTIAL SKILLS REQUIREMENTS:

Above-average experience / understanding (in order of importance):

Technical Skills / Technology:

  • Terraform
  • Python 3.x
  • PySpark
  • Boto3
  • Docker
  • PowerShell / Bash

Basic experience / understanding of AWS Components (in order of importance):

  • Glue
  • CloudWatch
  • Lambda
  • DynamoDB
  • Step Functions
  • Parameter Store
  • Secrets Manager
  • CodeBuild / Pipeline
  • CloudFormation

Other experience / understanding:

  • Business Intelligence (BI) Experience
  • Technical data modelling and schema design (not drag and drop)
  • Kafka

Additional technologies include AWS EMR and Redshift, along with enterprise collaboration tools such as Confluence and JIRA. Candidates should also have experience developing technical documentation, knowledge of data formats (Parquet, AVRO, JSON, XML, CSV), and exposure to data quality tools such as Great Expectations. Experience with REST APIs and network troubleshooting is a plus.

ADVANTAGEOUS SKILLS REQUIREMENTS:

  • Exceptional analytical skills for large and complex data sets
  • Thorough testing and data validation skills
  • Strong communication and documentation skills
  • Self-driven with the ability to work independently and multitask
  • Experience building data pipelines using AWS Glue or similar platforms
  • Familiarity with AWS S3, RDS, or DynamoDB
  • Understanding of software design patterns
  • Ability to prepare detailed specifications for development
  • Strong organizational skills

QUALIFICATIONS / EXPERIENCE:

  • Relevant IT, Business, or Engineering Degree
  • Certifications such as AWS Certified Cloud Practitioner, SysOps, Developer, Architect, or HashiCorp Terraform Associate are preferred.

ROLE AND RESPONSIBILITIES:

Data Engineers will be responsible for building and maintaining Big Data Pipelines, ensuring data sharing aligns with classification requirements, and staying updated with industry trends. They will identify process improvements, explore new technologies, and mentor team members.

WHAT WE OFFER:

  • Cutting-edge global IT systems and processes
  • Flexible working hours (1960 hours/year)
  • High work-life balance
  • Remote/on-site flexibility
  • Vehicle promotions (terms apply)
  • Modern offices and dynamic team environment
  • Application of Agile methodologies

KEY SKILLS:

Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala

Employment Type: Full-Time

Experience: [Specify years]

Vacancy: 1
