
OM Bank - Data Engineer

Old Mutual

Johannesburg

On-site

ZAR 700 000 - 900 000

Full time

Yesterday

Job summary

A leading financial services organization in Johannesburg seeks an AWS Data Engineer to develop cloud-based data products and warehouse solutions. The ideal candidate will have extensive experience in building scalable data pipelines using AWS technologies and tools like Apache Airflow and dbt. A Bachelor's degree in Computer Science or a related field is required, along with a minimum of 3-5 years of relevant experience. The role includes responsibilities in operational delivery and technical leadership to enhance business insights.

Qualifications

  • 3-5 years' experience developing data pipelines using AWS technologies.
  • Experience in Data Vault and Dimensional modeling techniques.
  • Experience working in a high availability DataOps environment.

Responsibilities

  • Develop data products & data warehouse solutions in cloud environments.
  • Build data APIs and delivery services for operational applications.
  • Design and develop data models using dimensional modelling techniques.

Skills

Action Planning
Data Management
Data Modeling
Database Administration
Orchestration with Apache Airflow
CI/CD

Education

Bachelor’s Degree in Computer Science or Information Systems

Tools

AWS
GitHub
dbt Core

Job description

Let's Write Africa's Story Together!

Old Mutual is a firm believer in the African opportunity and our diverse talent reflects this.

Job Description

Develop data products & data warehouse solutions in cloud environments using cloud-based services, platforms and technologies. As an AWS Data Engineer, you will design and maintain data analytic road maps and data structures that support business and technology objectives.

We are looking for a dynamic and results-driven Data Engineer with extensive experience in designing, developing, and deploying high-performance, reusable data engineering pipelines and data products using Python, dbt and Apache Airflow.

Proven expertise in building scalable data pipelines and real-time processing systems that enhance operational efficiency and drive business insights. Strong background in microservices architecture, cloud technologies, and agile methodologies.

KEY RESULT AREAS

Operational Delivery

  • Building the Data Lake using AWS technologies such as S3, Athena and QuickSight
  • Building data APIs and data delivery services to support critical operational and analytical applications
  • Participate in the development of workflow, coding, testing and deployment solutions
  • Implement unit testing for all assigned deliverables to ensure deployment success
  • Support and maintain solutions
  • Design and develop data models using dimensional modelling and data vault techniques
  • Work from high level requirements through to detailed specifications, prototypes, software deployment and administration
  • Deliver incremental business value per delivery phase (sprint/cycle)
  • Deliver iteratively throughout the cycle
  • Conduct peer reviews within and across squads
  • Profile and analyse data sets

Technical Leadership

  • Participate in the engineering and other disciplines' communities of practice
  • Share AWS knowledge and practical experience with the community
  • Challenge and contribute to development of architectural principles and patterns

Compliance

  • Ensure solutions adhere to OM Bank patterns, guidelines and standards
  • Operate within project environments and participate in continuous improvement efforts

Delivery Management

  • Follow and participate in defined ways of work including, but not limited to, sprint planning, backlog grooming, retrospectives, demos and PI planning

ROLE REQUIREMENTS

  • Experience of developing solutions in the cloud
  • Minimum of 3-5 years' experience with designing and developing data pipelines for data ingestion or transformation using AWS technologies
  • Experience in developing data warehouses and data marts
  • Experience in Data Vault and Dimensional modelling techniques
  • Experience working in a high availability DataOps environment
  • Experience working with automated data warehousing solutions would be advantageous
  • Orchestration with Apache Airflow
  • CI/CD
  • GitHub
  • dbt Core

Qualifications

  • Bachelor’s Degree in Computer Science or similar fields like Information Systems, Big Data, etc.
  • AWS Data Engineer Certification would be advantageous
  • Related Technical certifications

Skills

Action Planning, Business Requirements Analysis, Computer Literacy, Database Administration, Database Reporting, Data Compilation, Data Controls, Data Management, Data Modeling, Executing Plans, Gaps Analysis, Information Technology (IT) Support, IT Architecture, IT Implementation, IT Network Security, Market Analysis, Test Case Management, User Requirements Documentation

Competencies

Action Oriented, Business Insight, Cultivates Innovation, Drives Results, Ensures Accountability, Manages Complexity, Optimizes Work Processes, Persuades

Education

Bachelor's Degree (B)

Closing Date

01 December 2025, 23:59

The appointment will be made from the designated group in line with the Employment Equity Plan of Old Mutual South Africa and the specific business unit in question.

