Expert Big Data Engineer (Contract) | Gauteng, Hybrid | ISB1201223

iSanqa Resourcing

Midrand

Hybrid

ZAR 60 000 - 90 000

Full time


Job summary

A global automotive analytics leader is seeking a Data Engineer to design and implement large-scale data pipelines. This hybrid role in Midrand requires extensive experience in data engineering, cloud services, and technical leadership. You'll ensure data quality, mentor team members, and drive innovation through advanced data architecture. Ideal candidates have a relevant degree and technical skills in AWS and data processing languages. Exceptional analytical and collaborative skills are essential for success.

Qualifications

  • 8 years' related experience in data engineering and Big Data pipelines.
  • Experience working with Enterprise Collaboration tools like Confluence and JIRA.
  • Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV.

Responsibilities

  • Build and maintain Big Data pipelines; design, develop, and optimize ETL processes.
  • Ensure data quality checks and validation along with compliance and data governance.
  • Mentor team members and provide technical guidance on data architecture best practices.

Skills

Python 3.x
PySpark
PowerShell/Bash
SQL (Oracle / PostgreSQL)
AWS Glue
Docker
Big Data
Business Intelligence (BI)

Education

Relevant IT / Business / Engineering Degree

Tools

Terraform
AWS Cloud Services
Kafka

Job description

Overview

Step into a high-impact Data Engineer role where you'll architect and run large-scale data pipelines across AWS and enterprise-grade data platforms. This is your chance to shape the data backbone powering analytics, operational systems, and business intelligence for a global automotive leader.

Own complex data ingestion, transformation, and provisioning streams. Champion security, compliance, and data quality across enterprise-wide data assets. If you thrive in environments where data engineering meets strategy, this is your arena.

Position Details

Contract: 1 February 2026 – 31 December 2028

Experience: 8 years' related experience

Commencement: 1 February 2026

Location: Hybrid – Midrand / Menlyn / Rosslyn / Home Office rotation

Team: Data Science and Engineering – Enterprise Data & Analytics

Minimum Mandatory Qualifications
  • Relevant IT / Business / Engineering Degree

Preferred Certifications
  • AWS Certified Cloud Practitioner
  • AWS Certified SysOps Associate
  • AWS Certified Developer Associate
  • AWS Certified Architect Associate
  • AWS Certified Architect Professional
  • HashiCorp Certified Terraform Associate

Minimum Mandatory Experience
  • Above-average experience in and understanding of data engineering and Big Data pipelines
  • Experience working with Enterprise Collaboration tools such as Confluence and JIRA
  • Experience developing technical documentation and artefacts
  • Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV
  • Knowledge of the Agile Working Model

Essential Skills
  • Programming & Scripting: Python 3.x, PySpark, PowerShell/Bash, Boto3
  • Infrastructure as Code: Terraform
  • Databases & Data Processing: SQL (Oracle / PostgreSQL), ETL, Big Data, technical data modelling and schema design
  • AWS Cloud Services: Group Cloud Data Hub (CDH), Group CDEC Blueprint, AWS Glue, CloudWatch, SNS, Athena, S3, Kinesis Streams/Firehose, Lambda, DynamoDB, Step Functions, Parameter Store, Secrets Manager, CodeBuild / Pipeline, CloudFormation, AWS EMR, Redshift
  • Big Data Technologies: Kafka
  • Containerization & Operating Systems: Docker, Linux/Unix
  • Analytics: Business Intelligence (BI) experience
  • Soft Skills: self-driven team player, strong written and verbal communication, strong organizational skills, collaborative, problem-solving, and a strong work ethic

Advantageous Skills
  • Expertise in data modelling (Oracle SQL)
  • Exceptional analytical skills for large and complex datasets
  • Data pipeline building using AWS Glue, AWS Data Pipeline, or similar platforms
  • Familiarity with data stores such as AWS S3, AWS RDS, or DynamoDB
  • Experience with software design patterns
  • Experience preparing specifications and designing, coding, testing, and debugging programs
  • Experience with Data Quality Tools such as Great Expectations
  • Experience developing and working with REST APIs
  • Basic experience in networking and troubleshooting network issues

Role Requirements
  • Data Pipeline Development: Build and maintain Big Data pipelines; design, develop and optimize ETL processes; implement pipelines using AWS Glue, Lambda, Step Functions and other AWS services
  • Data Governance & Security: Act as custodian of data; ensure data sharing aligns with information classification; ensure data protection, compliance, data quality checks and validation
  • Technical Leadership: Mentor, train and upskill team members; provide technical guidance on data architecture and best practices; review and approve technical designs
  • Innovation & Improvement: Stay current with latest data engineering tools; identify process improvements and automation opportunities; evaluate new technologies to drive innovation
  • Data Modelling & Architecture: Design and implement technical data models and schema designs; ensure scalability and performance; maintain data architecture documentation
  • Collaboration: Work with cross‑functional teams to gather requirements and deliver data solutions; collaborate with stakeholders; support Enterprise D&A Use Cases (TOP20) and operational processes
  • Testing & Quality Assurance: Perform thorough testing and data validation; implement automated testing frameworks; monitor pipeline performance and troubleshoot issues
  • Documentation: Develop technical documentation and artefacts; create and maintain runbooks and operational procedures; document data lineage and metadata
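As a loose illustration of the "data quality checks and validation" responsibility above (not part of the role description itself), the sketch below shows a minimal validate-and-partition step in plain Python. The field names and rules are hypothetical; in practice this logic would typically run inside a PySpark/AWS Glue job or a dedicated tool such as Great Expectations.

```python
# Minimal data-quality gate for a pipeline batch.
# Schema and rules (vehicle_id, event_time, speed_kmh) are hypothetical.
from datetime import datetime

REQUIRED_FIELDS = {"vehicle_id", "event_time", "speed_kmh"}  # assumed schema


def validate_record(record: dict) -> list[str]:
    """Return human-readable validation errors; empty list means the record passes."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
        return errors  # no point checking values that aren't there
    try:
        datetime.fromisoformat(record["event_time"])
    except (TypeError, ValueError):
        errors.append("event_time is not ISO-8601")
    if not isinstance(record["speed_kmh"], (int, float)) or record["speed_kmh"] < 0:
        errors.append("speed_kmh must be a non-negative number")
    return errors


def partition_batch(records: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Split a batch into (valid, rejected); rejected rows carry their error list."""
    valid, rejected = [], []
    for rec in records:
        errs = validate_record(rec)
        if errs:
            rejected.append((rec, errs))  # e.g. routed to a quarantine S3 prefix
        else:
            valid.append(rec)
    return valid, rejected
```

Valid rows would continue downstream (e.g. to a curated S3 layer), while rejected rows and their errors would be quarantined for investigation, which is the usual shape of a pipeline quality gate.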

Additional Information

South African citizens / residents are preferred. Applicants with valid work permits will also be considered. By applying, you consent to being added to the database and to receiving updates until you unsubscribe. If you do not receive a response within two weeks, please consider your application unsuccessful.
