Specialist Platform Engineer

Absa Group

Sandton

Hybrid

ZAR 500 000 - 800 000

Full time

Today

Job summary

A leading banking institution in Sandton is seeking an engineer to build and maintain Kafka-based streaming applications. The role blends development and platform responsibilities while working with distributed systems at scale. The ideal candidate has strong programming skills in Java, Python, or .NET, a solid understanding of event-driven architectures, and hands-on experience with Kafka, including its security best practices. This position offers the opportunity to contribute to a dynamic team in a hybrid working environment.

Responsibilities

  • Develop, maintain, and optimize Kafka-based applications.
  • Work with distributed systems and fault tolerance.
  • Manage and secure Kafka clusters and deployments.
  • Automate infrastructure deployment using infrastructure-as-code tools.
  • Maintain observability for Kafka clusters.

Skills

Java (Spring/Spring Boot)
Python
.NET
Distributed systems concepts
Event-driven architectures
Kafka in production
Infrastructure as code
Monitoring tools (e.g., Prometheus, Grafana)

Education

Bachelor's Degree in Information Technology

Tools

Terraform
Ansible
Docker
Kubernetes
Confluent Cloud

Job description
Empowering Africa’s tomorrow, together…one story at a time.

With over 100 years of rich history and strongly positioned as a local bank with regional and international expertise, a career with our family offers the opportunity to be part of this exciting growth journey, to reset our future and shape our destiny as a proudly African group.

Job Summary

We are looking for an engineer to join our team to help build and maintain Kafka-based streaming applications and support the Kafka platform across on-prem and Confluent Cloud environments. The role is a hybrid of development, platform responsibilities, and observability, providing a unique opportunity to work on distributed systems at scale.

Job Description

Core Responsibilities
  • Develop, maintain, and optimize Kafka-based applications and event streaming pipelines using Java (Spring/Spring Boot), Python, or .NET (see the client sketch after this list).
  • Work with distributed systems concepts: partitions, replication, fault tolerance, scaling, and event-driven architectures.
  • Contribute to provisioning, managing, and securing Kafka clusters both on-prem and in Confluent Cloud.
  • Implement and maintain security and authorization mechanisms, including ACLs, Kerberos, SSL, and OAuth for Confluent Cloud.
  • Automate infrastructure deployment and configuration using Terraform, Ansible, CloudFormation, Docker, or Kubernetes.
  • Configure, monitor, and maintain observability for Kafka clusters, including metrics, alerts, and dashboards (e.g., Prometheus, Grafana, Confluent Control Center, ElasticSearch).
  • Assist in troubleshooting production issues and perform root cause analysis.
  • Collaborate closely with developers, DevOps/SRE teams, and other stakeholders to ensure reliable and performant streaming systems.
  • Contribute to best practices for connector configuration, high availability, disaster recovery, and performance tuning, including streaming applications and pipelines built with Kafka Streams, ksqlDB, Apache Flink, and TableFlow.
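
As a rough illustration of the application-development side of the role, the sketch below shows a minimal Spring Kafka producer and listener in Java. It is a hedged example rather than part of the posting: the topic name, consumer group, and payload type are hypothetical, and a real service would add schema-based serialization, error handling, and the security settings described above.

    // Minimal Spring Kafka sketch (hypothetical topic and group names; assumes spring-kafka on the classpath).
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.stereotype.Service;

    @Service
    public class PaymentEventsService {

        private final KafkaTemplate<String, String> kafkaTemplate;

        public PaymentEventsService(KafkaTemplate<String, String> kafkaTemplate) {
            this.kafkaTemplate = kafkaTemplate;
        }

        // Publish an event keyed by account id so related events land on the same partition.
        public void publish(String accountId, String payloadJson) {
            kafkaTemplate.send("payments.events.v1", accountId, payloadJson);
        }

        // Consume events from the same topic; the listener container handles polling and offset commits.
        @KafkaListener(topics = "payments.events.v1", groupId = "payments-projection")
        public void onEvent(String payloadJson) {
            // Hypothetical processing step: update a read model or forward to a downstream system.
            System.out.println("Received event: " + payloadJson);
        }
    }

In a Spring Boot application the KafkaTemplate and listener container are typically auto-configured from application properties, which is also where broker addresses and SSL/SASL settings would normally live.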

Required Skills
  • Strong programming experience in Java (Spring/Spring Boot), Python, or .NET. Ability to write clean, maintainable, and performant code.
  • Solid understanding of distributed systems principles and event-driven architectures.
  • Hands-on experience with Kafka in production or strong ability to learn quickly.
  • Knowledge of Kafka ecosystem components (Connect, Schema Registry, KSQL, MirrorMaker, Control Center, Kafka Streams, Apache Flink, TableFlow) is a plus.
  • Familiarity with security best practices for Kafka, including ACLs, Kerberos, SSL, and OAuth (see the configuration sketch after this list).
  • Experience with infrastructure as code and containerized environments.
  • Experience with monitoring and observability tools for distributed systems.
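
To make the security expectations above concrete, here is a small, hedged sketch of a plain Java consumer configured for SASL_SSL in the style commonly used with Confluent Cloud. The bootstrap address, topic, and credential placeholders are hypothetical; a Kerberos- or OAuth-protected cluster would use a different sasl.mechanism and login module, and topic-level ACLs are applied on the cluster rather than in client code.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.CommonClientConfigs;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.config.SaslConfigs;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class SecureConsumerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Hypothetical broker endpoint and consumer group.
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "<BOOTSTRAP_SERVER>:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "payments-projection");
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

            // Encrypt in transit and authenticate over SASL (API key/secret placeholders).
            props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
            props.put(SaslConfigs.SASL_MECHANISM, "PLAIN");
            props.put(SaslConfigs.SASL_JAAS_CONFIG,
                    "org.apache.kafka.common.security.plain.PlainLoginModule required "
                            + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("payments.events.v1"));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("key=%s value=%s%n", record.key(), record.value());
                }
            }
        }
    }

The same security.protocol, sasl.mechanism, and sasl.jaas.config settings apply to producers and Kafka Streams applications, so in practice they are usually externalised to configuration rather than hard-coded as shown here.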

Desirable Skills / Bonus Points
  • Experience with Confluent Cloud or other managed Kafka platforms.
  • Experience with AWS.
  • Experience building streaming pipelines across multiple systems and environments.
  • Familiarity with CI/CD pipelines and automated deployments.

Behavioral / Soft Skills
  • Strong problem-solving and analytical skills.
  • Excellent communication and interpersonal skills.
  • Ability to work independently and prioritize across multiple BAU and project tasks.
  • Product-minded approach, focusing on delivering value and scalable solutions.

Education

Bachelor's Degree: Information Technology

Absa Bank Limited is an equal opportunity, affirmative action employer. In compliance with the Employment Equity Act 55 of 1998, preference will be given to suitable candidates from designated groups whose appointments will contribute towards achievement of equitable demographic representation of our workforce profile and add to the diversity of the Bank.

Absa Bank Limited reserves the right not to make an appointment to the post as advertised.
