WFS Senior DataOps Engineer

Woolworths Financial Services

Cape Town

Hybrid

ZAR 600 000 - 800 000

Full time

10 days ago

Job summary

A reputable financial services provider in Cape Town seeks a Senior DataOps Engineer to design and maintain data pipelines in a cloud-native environment. The ideal candidate will have over 6 years of experience in DataOps, strong skills in AWS data services, and proficiency in scripting with Python or Shell. The role offers professional development opportunities, competitive compensation, and flexible work options within a collaborative culture focused on innovation.

Job description

Woolworths Financial Services | Full time

Cape Town, South Africa | Posted on 09/03/2025

Woolworths Financial Services, better known as WFS, is a joint venture with Absa Bank that supports the Woolworths retail business by providing in‑store credit in the form of the Woolworths StoreCard and by offering value‑added services including credit cards, personal loans, and short‑term insurance, as well as life insurance linked to other products.

Main Purpose

As a Senior DataOps Engineer, you will be responsible for designing, automating, and maintaining robust data pipelines and infrastructure that enable continuous data integration, delivery, and observability in a dynamic cloud environment. You will work alongside data engineers, architects, analysts, and platform teams to ensure data is flowing securely, efficiently, and reliably from source to destination.

Key Responsibilities
  • Design, build, and maintain CI/CD pipelines for data services and workflows.
  • Automate data pipeline orchestration using tools like Apache Airflow, AWS Step Functions, or Prefect (a brief illustrative sketch follows this list).
  • Ensure data quality, testing, and monitoring are integrated into pipelines using tools like dbt, Great Expectations, or similar.
  • Collaborate with engineering and analytics teams to promote data infrastructure as code using Terraform or AWS CloudFormation.
  • Implement logging, monitoring, and alerting for data operations using services like AWS CloudWatch, Datadog, or Prometheus (see the second sketch below).
  • Support deployment and version control of data models, transformations, and schemas.
  • Drive adoption of DevOps and Agile best practices in data projects.
  • Lead initiatives to improve data pipeline performance, cost optimization, and scalability.
  • Troubleshoot and resolve pipeline failures or data integrity issues in a production environment.
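
Purely as an illustration of the orchestration‑plus‑quality‑gate pattern described above, here is a minimal sketch assuming Apache Airflow 2.4+; the DAG id, task names, and record fields are hypothetical placeholders, not part of the role.

# Illustrative sketch only, assuming Apache Airflow 2.4+. The DAG id,
# task names, and record fields below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Stand-in for pulling raw records from a source system.
    return [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 250.5}]


def validate(ti):
    # A lightweight quality gate in the spirit of dbt / Great Expectations
    # checks: fail the run if any record is missing a required field.
    rows = ti.xcom_pull(task_ids="extract")
    bad = [r for r in rows if "id" not in r or "amount" not in r]
    if bad:
        raise ValueError(f"{len(bad)} record(s) failed the schema check")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    validate_task = PythonOperator(task_id="validate", python_callable=validate)
    extract_task >> validate_task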
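
A second sketch, for the monitoring‑and‑alerting responsibility: it assumes boto3 with valid AWS credentials, and the namespace, metric, alarm, and SNS topic ARN are hypothetical.

# Illustrative sketch only, assuming boto3 with valid AWS credentials.
# The namespace, metric, alarm, and SNS topic ARN are hypothetical.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="af-south-1")

# Emit a custom metric from a pipeline run (e.g., count of failed rows).
cloudwatch.put_metric_data(
    Namespace="DataOps/Pipelines",
    MetricData=[{"MetricName": "FailedRows", "Value": 3.0, "Unit": "Count"}],
)

# Alarm whenever any failed rows appear in a 5-minute window; AlarmActions
# points at a (hypothetical) SNS topic that notifies the on-call engineer.
cloudwatch.put_metric_alarm(
    AlarmName="pipeline-failed-rows",
    Namespace="DataOps/Pipelines",
    MetricName="FailedRows",
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=1.0,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=["arn:aws:sns:af-south-1:123456789012:data-alerts"],
)
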
Requirements
  • Bachelor’s degree in Computer Science, Engineering, or a related field.
  • 6+ years in data engineering, DevOps, or DataOps roles.
  • Hands‑on experience with AWS data services such as S3, Glue, Lambda, Redshift, RDS, Athena, and Step Functions.
  • Strong scripting and programming skills in Python or Shell.
  • Deep understanding of CI/CD tools (e.g., Bitbucket, GitLab CI, Jenkins, GitHub Actions).
  • Experience with infrastructure as code (e.g., Terraform, CloudFormation).
  • Familiarity with modern orchestration frameworks (e.g., Airflow, Dagster, Prefect).
  • Expertise in data pipeline design, data testing, and metadata management.
Preferred Qualifications
  • Understanding of data governance frameworks and cataloguing tools.
  • Experience with real‑time streaming technologies such as Kafka, Kinesis, or Confluent Cloud (a brief sketch follows this list).
  • Familiarity with containerization and Kubernetes.
  • Exposure to dbt, Great Expectations, or similar testing frameworks.
  • AWS certification (e.g., DevOps Engineer, Data Analytics Specialty, or Solutions Architect) is highly desirable.
  • Strong problem‑solving and incident response skills.
  • High attention to detail and focus on reliability.
  • Effective communicator, capable of working cross‑functionally.
  • Passion for automation, efficiency, and clean documentation.
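
For the streaming item above, a minimal producer sketch, again assuming boto3 and an already‑provisioned Kinesis stream; the stream name and event fields are hypothetical.

# Illustrative sketch only, assuming boto3 and an existing Kinesis
# stream; the stream name and event fields are hypothetical.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="af-south-1")


def publish_event(event: dict) -> None:
    # One JSON-encoded record per call; PartitionKey controls shard
    # routing, so events for one account stay ordered on one shard.
    kinesis.put_record(
        StreamName="card-transactions",
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event["account_id"]),
    )


publish_event({"account_id": 42, "amount": 199.99, "currency": "ZAR"})
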
What We Offer
  • A cloud‑native environment focused on innovation and learning.
  • Opportunities for professional development and AWS certification support.
  • Competitive compensation and benefits.
  • Flexible work options, including hybrid or remote work.
  • A culture of collaboration and data‑driven impact.
Personal Attributes
  • Curiosity and passion for understanding technology.
  • Analytical, with an affinity for solving problems.
  • Ability to work under pressure in a fast‑paced, dynamic environment.
  • Eager to embrace unfamiliar situations with a positive mindset.