Senior Data Engineer - AWS

Psybergate

Gauteng

On-site

ZAR 600 000 - 900 000

Full time

13 days ago

Job summary

An IT solutions provider in South Africa seeks a Senior Data Engineer to design and maintain scalable data pipelines. The ideal candidate has 4-7 years of experience and proficiency in Python, SQL, and AWS. Responsibilities include developing robust data solutions, optimizing data flows, and collaborating with stakeholders. This role promotes a data-driven culture and involves mentoring junior engineers. A degree in a relevant field is essential.

Qualifications

  • 4 – 7 years of hands‑on data engineering experience.
  • Advanced proficiency in Python and SQL.
  • Experience building and maintaining cloud-based data architecture, preferably AWS.
  • Strong experience with BI tools like Power BI for data storytelling.

Responsibilities

  • Design, develop and maintain complex data pipelines from multiple sources into a central data platform.
  • Ensure reliability, scalability, and maintainability of data pipelines.
  • Lead initiatives for automation, monitoring, and CI/CD for data engineering workflows.
  • Translate business requirements into technical specifications.

Skills

Python
SQL
Agile methodologies
Database design
Data modeling
Data storytelling

Education

Degree or diploma in Computer Science, Information Systems, Engineering, or a related field

Tools

AWS
Power BI
Microsoft SQL Server
Oracle
MongoDB
Amazon S3
Git

Job description

Psybergate is an IT company that builds bespoke software solutions and provides highly skilled resources to its clients. We are looking for a Senior Data Engineer - AWS to join our financial services client based in Centurion.

The Senior Data Engineer is responsible for designing, building, and maintaining robust, scalable data pipelines and platforms to support advanced analytics, BI, and data-driven decision‑making.

This individual brings strong technical experience, shows leadership in data initiatives, and works closely with both technical and business stakeholders to ensure high‑quality data solutions.

What you will be doing

  • Design, develop and maintain complex data pipelines from multiple sources into a central data platform / lakehouse.
  • Ensure reliability, scalability, and maintainability of pipelines.
  • Optimize data flows and data quality checks.
  • Contribute to the architectural design and enhancements of the data platform.
  • Support the implementation of cloud‑first data solutions, primarily in AWS.
  • Lead initiatives for automation, monitoring, and CI/CD for data engineering workflows.
  • Provide technical guidance and mentorship to Data Engineers.
  • Advocate for best practices in data engineering, including version control, testing, and documentation.
  • Conduct code reviews and support knowledge sharing across the team.
  • Collaborate with data scientists, analysts, software engineers, and business stakeholders.
  • Translate business requirements into technical specifications and deliverables.
  • Support data consumers by developing reports, data products, and self‑service solutions.
  • Keep up to date with new tools, technologies, and best practices in data engineering.
  • Evaluate and integrate new data sources and technologies as required.
  • Champion a data‑driven culture within the organisation.

What we are looking for

  • Completed degree or diploma in Computer Science, Information Systems, Engineering, or a related field.
  • 4 – 7 years of hands‑on data engineering experience.
  • Advanced proficiency in Python and SQL.
  • Strong database design knowledge and experience with data warehousing techniques and modelling approaches.
  • Experience building and maintaining cloud‑based data architecture (AWS preferred).
  • Hands‑on experience with data ingestion from sources including Microsoft SQL Server, Oracle, MongoDB, Amazon S3 and other AWS data services, HTTP APIs, SFTP, and various file systems.
  • Proficiency with Git, CI/CD pipelines, and Agile methodologies.
  • Familiarity with machine learning workflows and supporting analytics teams.
  • Strong experience with BI tools like Power BI for data storytelling.

Please note that if you do not hear from us within 3 weeks, you should consider your application unsuccessful.
