
Senior Specialist Cloud Data Engineer

Vodafone Global Enterprise

Cape Town

On-site

ZAR 700,000 - 900,000

Full time

Job summary

A leading telecommunications company is seeking a Senior Specialist Cloud Data Engineer in Cape Town. This role focuses on designing, developing, and maintaining data infrastructure that supports efficient data processing. The ideal candidate will have strong programming skills, big data technology expertise, and relevant cloud certifications. This position offers competitive benefits and the opportunity to drive innovation within the organization.

Benefits

Enticing incentive programs
Competitive benefit packages
Retirement funds
Medical aid benefits
Cell phone and data benefits

Qualifications

  • A 3-year degree or diploma in Computer Science, IT, or a related field is essential.
  • 5 to 8 years' relevant experience.
  • An AWS, GCP, or Azure cloud certification at professional or associate level is required.

Responsibilities

  • Design, develop, and maintain scalable data pipelines using batch and real-time technologies.
  • Integrate edge applications and cloud services for real-time data access.
  • Monitor and optimize performance of data applications and services.

Skills

Python
Java
Scala
Spark
Hive
MongoDB
CassandraDB
AWS
GCP
Azure

Education

Bachelor's degree in Computer Science, IT or related field

Tools

Docker
Kubernetes
OpenShift
Nginx

Job description

Job title: Senior Specialist Cloud Data Engineer

Job location: Western Cape, Cape Town
Deadline: August 08, 2025

Role Purpose / Business Unit:

  • The primary purpose of this Senior Specialist Cloud Data Engineer role is to design, develop, and maintain robust, scalable, and secure data and analytics infrastructure that supports batch and real-time data processing at scale.
  • This includes managing on-premises big data platforms, edge applications, and cloud deployments to ensure seamless integration and optimal performance across all environments.
  • The role is crucial in driving innovation, ensuring data integrity, and delivering actionable insights that empower the organization to make data-driven decisions.

Key Objectives:

  • Data Pipeline Excellence: Build and maintain efficient data pipelines that handle large volumes of data with high reliability and performance.
  • Edge and Cloud Integration: Seamlessly integrate edge applications and cloud services to provide real-time data access and machine learning capabilities.
  • Innovation and Improvement: Continuously seek opportunities to enhance the data infrastructure, adopt new technologies, and improve processes.
  • Collaboration and Leadership: Work closely with cross-functional teams to understand their data needs and provide technical leadership and guidance.
Your responsibilities will include:

  • Data Pipeline Management: Design, develop, and maintain scalable data pipelines using batch technologies (like Spark, NiFi, and Hive) and real-time technologies (like Kafka, Flink, and Spark Structured Streaming); a minimal streaming sketch follows this list.
  • Edge Application Development: Implement and manage edge applications using MongoDB and CassandraDB, ensuring efficient data processing and storage.
  • Microservices and Containerization: Develop and deploy microservices in an OpenShift containerized environment, utilizing tools like Nginx API Gateway for real-time data access.
  • Cloud Deployment and Management: Implement and support similar use cases in AWS, ensuring seamless integration between on-premises and cloud environments.
  • Performance Monitoring and Optimization: Continuously monitor and optimize the performance of data pipelines, applications, and services.
  • Security and Compliance: Ensure all systems and data processes comply with relevant security standards and regulations.
  • Technology Stack Selection: Recommend and decide on the appropriate technologies and tools to use for various components of the data and analytics infrastructure.
  • Architecture Design: Define the architecture for data pipelines, edge applications, and microservices to ensure scalability and reliability.
  • Resource Allocation: Allocate resources effectively to balance performance, cost, and scalability across on-premises and cloud environments.
  • Data Governance and Compliance: Establish and enforce data governance policies to ensure data quality, security, and compliance.
  • Incident Management: Lead the response to any incidents or outages, ensuring quick resolution and minimal impact on operations.
  • Innovation and Improvement: Continuously seek opportunities to improve processes, adopt new technologies, and drive innovation within the team.
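
To make the real-time side of the pipeline work above concrete, here is a minimal, non-authoritative sketch using PySpark Structured Streaming to read JSON events from a Kafka topic and land them as Parquet. The broker address, topic name, event schema, and paths are illustrative assumptions, not details from this posting; running it also requires Spark's Kafka connector package on the classpath.

```python
# Minimal sketch of a Kafka -> Spark Structured Streaming -> Parquet pipeline.
# Broker, topic, schema, and paths below are placeholders for illustration.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import LongType, StringType, StructField, StructType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Hypothetical event schema; a real deployment would define its own.
schema = StructType([
    StructField("device_id", StringType()),
    StructField("metric", StringType()),
    StructField("value", LongType()),
])

# Read a stream of raw records from Kafka (the value column arrives as bytes).
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
)

# Decode each Kafka value as JSON into typed columns.
parsed = (
    raw.select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Land the parsed events as Parquet; the checkpoint directory lets the
# stream recover its progress after a restart.
query = (
    parsed.writeStream.format("parquet")
    .option("path", "/data/events")                    # placeholder path
    .option("checkpointLocation", "/chk/events")       # placeholder path
    .start()
)
query.awaitTermination()
```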
The ideal candidate for this role will have:

  • A 3-year degree or diploma in Computer Science, IT, IS, or a related field (essential)
  • 5 to 8 years' relevant experience
  • A relevant AWS, GCP, or Azure cloud certification at professional or associate level
  • Data engineering or related software development experience
  • Agile exposure, working with Kanban or Scrum
Key Competencies:

  • Technical Proficiency: Strong skills in programming languages such as Python, Java, or Scala
  • Big Data Technologies: Expertise in tools like Spark, Hive, Parquet, Iceberg, etc.
  • Database Management: Proficiency with both relational and NoSQL databases (e.g., MongoDB, CassandraDB)
  • Cloud Computing: Experience with cloud platforms like AWS and GCP, including services for data storage and processing
  • Containerization and Microservices: Knowledge of containerization technologies (e.g., Docker, Kubernetes) and microservices architecture, particularly in OpenShift, AWS ECS, and GCP GKE environments
  • API Management: Experience with API gateways like Nginx and developing APIs for real-time data access (see the sketch after this list)
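
As a rough illustration of the last competency, the sketch below exposes a real-time read path over MongoDB as a small Python microservice; in the stack described, it would run in a container behind an Nginx API gateway. FastAPI, the connection string, and the database, collection, and field names are assumptions made for the example, not part of the posting.

```python
# Hypothetical microservice exposing real-time reads over MongoDB.
# In the described stack it would run in a container behind an Nginx gateway.
from fastapi import FastAPI, HTTPException
from pymongo import MongoClient

app = FastAPI()
client = MongoClient("mongodb://mongo:27017")  # placeholder connection string
events = client["telemetry"]["events"]         # placeholder db and collection

@app.get("/devices/{device_id}/latest")
def latest_event(device_id: str) -> dict:
    # Fetch the most recent event for a device, newest first by timestamp.
    doc = events.find_one({"device_id": device_id}, sort=[("ts", -1)])
    if doc is None:
        raise HTTPException(status_code=404, detail="no events for device")
    doc["_id"] = str(doc["_id"])  # ObjectId is not JSON-serializable
    return doc
```

Served by an ASGI server such as uvicorn, the gateway would then proxy and rate-limit requests to this service.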
Knowledge Areas:

  • Distributed Systems: In-depth understanding of distributed computing principles and technologies
  • Data Engineering: Knowledge of data pipeline design, ETL processes, and data integration
  • Security and Compliance: Familiarity with data security practices and regulatory compliance requirements
  • Performance Optimization: Techniques for monitoring and optimizing the performance of data systems and applications
  • Edge Computing: Understanding of edge computing concepts and technologies for processing data closer to the source
Experience:

  • Hands-On Experience: Several years of experience in software engineering, data engineering, or cloud engineering roles
  • Project Management: Experience managing complex projects, preferably in a big data or cloud environment
  • Team Collaboration: Proven ability to work effectively in cross-functional teams and communicate technical concepts to non-technical stakeholders
  • Problem-Solving: Strong analytical and problem-solving skills, with a track record of addressing complex technical challenges
  • Continuous Learning: Commitment to staying updated with the latest technologies and best practices in the field
We make an impact by offering:

  • Enticing incentive programs and competitive benefit packages
  • Retirement funds, risk benefits, and medical aid benefits
  • Cell phone and data benefits, fibre connection discounts, and exclusive staff discounts offered in collaboration with partner companies

Closing date for applications: 04 August 2025.

