Data Engineer

Cape Town

Remote

ZAR 500 000 - 700 000

Full time

Job summary

A technology firm specializing in data solutions is seeking a fully remote Data Engineer to develop lakehouses and reporting solutions. The ideal candidate has over 3 years of data engineering experience, strong skills in SQL and Python, and familiarity with ETL/ELT and cloud data services. The role offers flexible, remote work in a small, collaborative team focused on quality data solutions.

Benefits

Remote work
R15 annual study budget
Flexible working hours

Qualifications

  • 3+ years of experience in data engineering.
  • Strong programming skills in SQL and Python.
  • Hands-on experience with ETL/ELT and orchestration tools.

Responsibilities

  • Design, build, and maintain scalable ETL/ELT data pipelines.
  • Work with structured and unstructured data, ensuring data quality.
  • Engage with clients to translate requirements into solutions.

Skills

SQL
Python
Data Engineering
Problem Solving
Communication

Education

Degree in Computer Science, Engineering, or related field

Tools

Git
CI/CD
ETL/ELT tools
Cloud Data Services (GCP, Azure, AWS)

Job description

What We Do

PrediqAI builds advanced data and AI solutions for insurance companies, working across areas such as pet, motor, specialty, and construction insurance.

We develop data platforms that are robust, scalable, and highly optimised, forming the foundation for AI-driven applications.

In addition, we develop machine learning-driven smart pricing engines, quoting tools and AI agents.

In essence, we help insurers make better, faster, and smarter decisions.

Why Work For Us

At PrediqAI, we work in small, dynamic teams where we iterate quickly, while always making sure we do things the right way and build solutions that last.

You’ll collaborate with talented colleagues who are passionate about learning, and we make it easy for you to stay on top of your game with a R15 annual study budget. While we operate like a start-up, we are backed by an international insurance group and are their preferred data & AI partner, giving us the best of both worlds: innovation and stability.

We’re remote‑first: you can work from anywhere and travel freely, and we meet in person roughly once per quarter, giving you flexibility without losing connection.

Our Culture
  • Set the standard: Insist on excellence in everything you do.
  • Be human: Lead with humility and treat others with respect.
  • Get it done: Have a bias for action.
  • Adapt and evolve: Embrace new tools, technologies, and ways to improve.

Own your work and take pride in it.

Our Engineering Processes

Our engineering process is designed around collaboration, autonomy, and quality. We work in small, cross‑functional teams that own solutions end‑to‑end — from design discussions to deployment and monitoring.

Our stack leans modern and practical: Python, SQL, dbt, Spark/Databricks, and cloud‑native storage and orchestration.

We use CI/CD, version‑controlled transformations, and strong observability so we can deploy confidently and debug fast.

We’re big believers in code reviews, open communication, and leaving things better than we found them.

If you like shaping both the data platform and the culture around it, you’ll fit right in.

Your Role

We’re looking for a fully remote Data Engineer to join our team and contribute to the development of lakehouses and reporting solutions, ensuring high‑quality, reliable, and well‑governed data that supports our clients.

Responsibilities
  • Design, build, and maintain scalable ETL/ELT data pipelines.
  • Develop data models and ensure efficient data storage, retrieval, and processing.
  • Work with structured and unstructured data, ensuring data quality and integrity.
  • Deploy and manage end‑to‑end data processing solutions.
  • Ensure data security, compliance, and regulatory adherence in financial data workflows.
  • Engage with consultants and client teams to understand requirements and translate them into technical solutions.
  • Produce clear documentation for pipelines, datasets, and operational processes.
  • Identify and investigate data anomalies, tracing lineage to ensure accuracy, transparency, and trust in reporting.

Qualifications
  • 3+ years of experience in data engineering.
  • Technical background with a degree in Computer Science, Engineering, or a related field.
  • Strong programming skills in SQL and Python.
  • Experience with version control (Git) and CI/CD pipelines.
  • Hands‑on experience with ETL/ELT, orchestration tools, and cloud data services (GCP preferred; Azure or AWS acceptable).
  • Experience with conceptual, logical, and physical data modelling.
  • Strong problem‑solving mindset and attention to detail.
  • Collaborative team player with strong communication skills.
  • Experience in data transformations using dbt is beneficial.
  • Familiarity with insurance data structures or willingness to learn.