
Senior Data Engineer

Betterhome Group Ltd

Pretoria

On-site

ZAR 800 000 - 1 200 000

Full time

Job summary

A leading South African property and financial services group is seeking a Senior Data Engineer in Pretoria. The ideal candidate will have at least 8 years of experience in data engineering and will be responsible for designing and maintaining data pipelines, ensuring data integrity and scalability. Knowledge of cloud platforms such as Azure and strong SQL skills are essential. A competitive salary and opportunities for growth are offered.

Benefits

Competitive salary
Ongoing learning and development opportunities

Qualifications

  • 8+ years of experience in data engineering or data architecture.
  • Proven experience building and maintaining data pipelines using modern tools.
  • Strong proficiency in SQL and data warehouse design.

Responsibilities

  • Design, implement, and maintain ETL/ELT data pipelines.
  • Develop and maintain scalable data models for analytics.
  • Optimize data storage and performance for near real-time reporting.

Skills

Data pipeline development
SQL proficiency
Data modelling
Data governance understanding
Cloud platforms experience
Analytical skills

Education

Bachelor's degree in Computer Science or related field

Tools

Azure Data Factory
Synapse Analytics
Databricks

Job description

Job Title: Senior Data Engineer
Location: Hazelwood, Pretoria
Department: Data & Technology
Reports To: Head of IT Infrastructure / Data

About BetterHome Group: BetterHome Group is a leading South African property and financial services group, home to brands such as BetterBond, Private Property, MortgageMax and BetterSure. We’re transforming how South Africans buy, sell, and finance property through innovation, data, and technology.

Role Overview

We’re looking for a Senior Data Engineer to play a key role in shaping the foundation of our data ecosystem.

This role will design, implement, and maintain the data pipelines and models that power analytics and reporting across the Group.

You’ll work closely with cross-functional teams to ensure data integrity, scalability, and reliability, helping BetterHome Group move towards real-time, data-driven decision-making.

Key Responsibilities
  • Design, implement, and maintain ETL/ELT data pipelines that form the backbone of BetterHome Group’s reporting platform.
  • Develop and maintain scalable data models to support analytics and reporting requirements across multiple business units.
  • Continuously enhance data models and the data warehouse to improve performance, reliability, and analytical capability.
  • Optimise data storage, performance, and cost, with a focus on enabling near real-time reporting.
  • Collaborate with business and technology teams to translate business needs into practical, high-impact data solutions.
  • Implement and maintain data quality, validation, and monitoring mechanisms.
  • Manage and maintain operational data sources, ensuring stability and accuracy.
  • Enforce data governance, security, and compliance best practices.
  • Conduct cross-system monitoring to ensure business processes run as expected.
  • Engage effectively with stakeholders at all levels, communicating technical concepts clearly and confidently.
Skills and Experience
  • Essential: Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
  • 8+ years of experience in data engineering or data architecture, ideally within a data-driven or technology-focused organisation.
  • Proven experience building and maintaining data pipelines (ETL/ELT) using modern data tools and platforms.
  • Strong proficiency in SQL, data modelling, and data warehouse design.
  • Experience with cloud platforms (Azure preferred) and associated data services (e.g., Azure Data Factory, Synapse Analytics, or Databricks).
  • Solid understanding of data governance, security, and compliance principles.
  • Experience working in cross-functional teams and translating business requirements into technical solutions.
  • Excellent analytical, problem-solving, and communication skills.
Nice to Have
  • Experience with Microsoft Fabric would be a significant bonus.
  • Familiarity with real-time data streaming, DevOps practices, or CI/CD for data pipelines.
  • Knowledge of Python or other scripting languages for automation and data processing.
What You’ll Get

  • An opportunity to help shape the data landscape of a fast-growing, technology-led organisation.
  • A collaborative, forward-thinking team environment that values innovation and impact.
  • Competitive salary and access to ongoing learning and development opportunities.
