OM Bank - Data Engineer

Old Mutual

Cape Town

On-site

ZAR 600,000 - 900,000

Full time

14 days ago

Job summary

A leading company in South Africa is looking for an AWS Data Engineer to develop data products and warehouse solutions in cloud environments. The successful candidate will have a strong background in building scalable data pipelines and will work with technologies such as Python, dbt, and Apache Airflow. This role requires a Bachelor's degree in Computer Science, with relevant certifications advantageous, and a focus on delivering high-performance data engineering solutions.

Qualifications

  • Minimum of 3-5 years' experience in Data Pipeline development using AWS.
  • Experience with Data Vault and Dimensional modeling techniques.
  • Bachelor’s Degree in Computer Science or similar fields.

Responsibilities

  • Develop data products and data warehouse solutions in cloud environments.
  • Build data APIs and delivery services for operational applications.
  • Design and develop data models using dimensional modeling techniques.

Skills

Action Planning
Business Requirements Analysis
Data Management
Data Modeling
Database Administration
IT Architecture
Test Case Management

Education

Bachelor’s Degree in Computer Science
AWS Data Engineer Certification

Tools

Apache Airflow
GitHub
dbt Core

Job description

Let's Write Africa's Story Together!

Old Mutual is a firm believer in the African opportunity, and our diverse talent reflects this.

Develop data products and data warehouse solutions in cloud environments using cloud-based services, platforms, and technologies. As an AWS Data Engineer, you will design and maintain data analytics roadmaps and data structures that support business and technology objectives.

We are seeking a dynamic and results-driven Data Engineer with extensive experience in designing, developing, and deploying high-performance, reusable data engineering pipelines and data products using Python, dbt, and Apache Airflow.

You will bring proven expertise in building scalable data pipelines and real-time processing systems that enhance operational efficiency and drive business insights. A strong background in microservices architecture, cloud technologies, and agile methodologies is essential.

KEY RESULT AREAS

Operational Delivery

  • Build the Data Lake using AWS technologies such as S3, Athena, and QuickSight
  • Build data APIs and data delivery services to support critical operational and analytical applications
  • Participate in the development of workflows, coding, testing, and deployment solutions
  • Implement unit testing for all assigned deliverables to ensure deployment success
  • Support and maintain solutions
  • Design and develop data models using dimensional modeling and data vault techniques
  • Work from high-level requirements through to detailed specifications, prototypes, software deployment, and administration
  • Deliver incremental business value per delivery phase (sprint/cycle)
  • Deliver iteratively throughout the cycle
  • Conduct peer reviews within and across squads
  • Profile and analyze data sets

Technical Leadership

  • Participate in the engineering and other disciplines' communities of practice
  • Share AWS knowledge and practical experience with the community
  • Challenge and contribute to the development of architectural principles and patterns

Compliance

  • Ensure solutions adhere to OM Bank patterns, guidelines, and standards
  • Operate within project environments and participate in continuous improvement efforts

Delivery Management

  • Follow and participate in defined ways of work including sprint planning, backlog grooming, retrospectives, demos, and PI planning

ROLE REQUIREMENTS

  • Experience developing solutions in the cloud
  • Minimum of 3-5 years' experience designing and developing Data Pipelines for Data Ingestion or Transformation using AWS technologies
  • Experience in developing data warehouses and data marts
  • Experience with Data Vault and Dimensional modeling techniques
  • Experience working in a high availability DataOps environment
  • Experience with automated data warehousing solutions is advantageous
  • Orchestration with Apache Airflow
  • CI/CD
  • GitHub
  • dbt Core

Qualifications

  • Bachelor’s Degree in Computer Science or similar fields like Information Systems, Big Data, etc.
  • AWS Data Engineer Certification is advantageous
  • Related technical certifications

Skills

Action Planning, Business Requirements Analysis, Computer Literacy, Database Administration, Database Reporting, Data Compilation, Data Controls, Data Management, Data Modeling, Executing Plans, Gaps Analysis, IT Support, IT Architecture, IT Implementation, IT Network Security, Market Analysis, Test Case Management, User Requirements Documentation

Competencies

Action Oriented, Business Insight, Cultivates Innovation, Drives Results, Ensures Accountability, Manages Complexity, Optimizes Work Processes, Persuades

Education

Bachelor's Degree (B)

Closing Date

27 May 2025, 23:59

The appointment will be made from the designated group in line with the Employment Equity Plan of Old Mutual South Africa and the specific business unit in question.
