Data Engineer

Paracon

Johannesburg

Hybrid

ZAR 600,000 - 850,000

Full time

19 days ago

Job summary

A leading financial services and insurance organization in Johannesburg seeks an experienced Data Engineer. The role focuses on building and maintaining robust, scalable data pipelines using modern DevOps practices, working with cross-functional teams on AWS services, Docker, and Kubernetes. It offers an opportunity to strengthen the company's data-driven decision-making capabilities.

Qualifications

  • 5+ years of experience in data engineering and DevOps integration.
  • Strong knowledge of Apache Spark, Kafka, and AWS tools.
  • Excellent troubleshooting and communication skills.

Responsibilities

  • Design, develop, and deploy scalable data pipelines.
  • Collaborate with teams to manage CI/CD pipelines.
  • Monitor data pipelines for performance and reliability.

Skills

DevOps
Apache Spark
Kafka
AWS
Docker
Kubernetes

Education

Bachelor's degree in Computer Science

Tools

Apache Spark
Kafka
AWS DevOps tools
Docker
Kubernetes

Job description

Data Engineer

A leading financial services and insurance organization is seeking an experienced Data Engineer with a strong focus on DevOps to join its Data Engineering Department. This role is instrumental in designing, building, and maintaining robust, scalable data pipelines and systems that support data-driven decision-making across the enterprise.

You will work closely with cross-functional teams to enable seamless integration with AWS services, drive containerization through Docker and Kubernetes, and manage performance of Apache Spark and Kafka deployments. If you're passionate about optimizing data operations and leveraging modern DevOps and cloud practices, we'd like to meet you.

Key Responsibilities:

  • Design, develop, and deploy scalable and efficient data pipelines tailored to business needs.
  • Collaborate with the DevOps team to build and manage CI/CD pipelines using AWS tools such as CodePipeline, CodeBuild, and CodeDeploy.
  • Containerize and orchestrate data processing applications using Docker and Kubernetes.
  • Manage and optimize deployments of Apache Spark and Kafka to support high-performance data processing (a minimal sketch of such a pipeline follows this list).
  • Monitor data pipelines for performance, reliability, and error reduction.
  • Apply security and compliance best practices in all data engineering workflows.
  • Evaluate and introduce new tools and technologies to improve data engineering productivity and system performance.
  • Provide technical support to team members, resolving issues related to data workflows and infrastructure.
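
For orientation, the sketch below illustrates the kind of Kafka-to-Spark pipeline these responsibilities describe: a minimal PySpark Structured Streaming job that reads events from a Kafka topic and lands them as Parquet. The topic name, broker address, schema, and storage paths are illustrative placeholders, not details from this posting.

    # Minimal, illustrative PySpark Structured Streaming pipeline (placeholders throughout).
    # Requires the spark-sql-kafka connector package on the Spark classpath.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

    spark = SparkSession.builder.appName("claims-events-pipeline").getOrCreate()

    # Placeholder event schema, for illustration only.
    event_schema = StructType([
        StructField("policy_id", StringType()),
        StructField("amount", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    # Read from a placeholder Kafka topic and parse the JSON payload.
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
        .option("subscribe", "claims-events")               # placeholder topic
        .load()
        .select(from_json(col("value").cast("string"), event_schema).alias("e"))
        .select("e.*")
    )

    # Land the parsed events as Parquet, with checkpointing for reliability.
    query = (
        events.writeStream
        .format("parquet")
        .option("path", "s3a://example-bucket/claims/")      # placeholder sink
        .option("checkpointLocation", "s3a://example-bucket/checkpoints/claims/")
        .start()
    )
    query.awaitTermination()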

Required Skills & Qualifications:

  • Bachelor's degree in Computer Science, Software Engineering, or a related field.
  • 5+ years of data engineering experience, with hands-on pipeline development and DevOps integration.
  • Strong knowledge and hands-on experience with:
    • Apache Spark
    • Databricks
    • Oracle
    • Kafka deployment and performance tuning
  • Proficiency in AWS DevOps tools including CodePipeline, CodeBuild, CodeDeploy, and CodeStar.
  • Experience using containerization and orchestration tools (Docker and Kubernetes).
  • Familiarity with other data processing frameworks such as Hadoop, Apache NiFi, or Apache Beam.
  • Excellent troubleshooting and problem-solving skills.
  • Strong communication skills, with the ability to bridge the gap between technical teams and business stakeholders.
Location:

  • Johannesburg, Gauteng

Workplace Type:

  • Hybrid

Job Type:

  • Contract

Experience Type:

  • Senior

We encourage you to apply: contact Kivara Rajgopal at [Email Address Removed] or via [Phone Number Removed].



Desired Skills:

  • DevOps
  • Kafka
  • AWS
  • Databricks
  • Oracle
  • Docker
  • CodePipeline

Desired Qualification Level:

  • Degree