Data Engineer

Kuda Technologies Ltd

Cape Town

On-site

ZAR 480 000 - 720 000

Full time

6 days ago

Job summary

A financial technology company is seeking a Data Engineer to design and maintain data pipelines, integrate diverse data sources, and collaborate on data initiatives. The ideal candidate has 3–5 years of experience, strong SQL skills, and familiarity with cloud platforms like GCP. The role emphasizes delivering data solutions to enhance analytics capabilities.

Qualifications

  • 3–5 years of experience as a Data Engineer or in a similar role.
  • Strong proficiency in SQL for query design and optimization.
  • Experience with relational databases like Microsoft SQL Server or PostgreSQL.

Responsibilities

  • Design, build, and maintain data ingestion and processing pipelines.
  • Integrate and transform data from various sources into high-quality datasets.
  • Collaborate with stakeholders to translate needs into data models.

Skills

Data pipeline design and maintenance
SQL proficiency
Data integration
Machine learning support
Cloud platforms (GCP)
Programming (Python preferred)

Education

Bachelor's degree in Computer Science or a related field

Tools

Azure Data Factory
Google BigQuery
Looker

Job description

Kuda is a money app for Africans on a mission to make financial services accessible, affordable and rewarding for every African on the planet.
We’re a tribe of passionate and diverse people who dreamed of building an inclusive money app that Africans would love, so it’s only right that we ended up with the name ‘Kuda’, which means ‘love’ in Shona, a language spoken in the southern part of Africa.
We’re giving Africans around the world a better alternative to traditional finance by delivering money transfers, smart budgeting and instant access to credit through digital devices.
We’ve raised over $90 million from some of the world's most respected institutional investors, and we’re rolling out our game-changing services globally from our offices in Nigeria, South Africa, and the UK.

Role Overview

We are expanding and seeking a Data Engineer to join our ranks and champion growth. With a passion for data-driven decision-making, you will play a pivotal role in shaping the future of banking for millions.

Roles and responsibilities

  • Design, build, and maintain scalable data ingestion and processing pipelines that support analytics, machine learning, and operational use cases.
  • Integrate and transform data from diverse structured and unstructured sources into trusted, high-quality datasets.
  • Collaborate with business stakeholders to translate analytical and operational needs into efficient data models and pipelines.
  • Monitor and optimise data pipeline performance, ensuring reliability, observability, and cost efficiency.
  • Support data quality, lineage, and metadata management through governance-aligned frameworks and automation.
  • Enable prescriptive and predictive analytics by preparing and curating datasets for machine learning models.
  • Partner with data scientists and ML engineers to deploy and operationalise ML models into production pipelines.
  • Continuously evaluate and implement modern tools and frameworks to improve data acquisition, orchestration, and delivery.
  • Develop analytical datasets, metrics layers, and transformations that power dashboards, self-service analytics, and advanced insights.
  • Collaborate closely with data architects, platform engineers, and business teams to ensure alignment between data strategy, architecture, and delivery.

Requirements

  • 3–5 years of hands-on experience as a Data Engineer or in a similar data-focused engineering role.
  • Strong proficiency in SQL, including advanced query design, optimisation, and performance tuning.
  • Proven experience working with relational databases such as Microsoft SQL Server, Azure SQL Database / Managed Instance, or PostgreSQL.
  • Demonstrated ability to integrate, transform, and consolidate data from heterogeneous sources (e.g., Azure SQL, SQL Managed Instances, or BigQuery) into analytics-ready datasets on Google Cloud Platform (GCP) using BigQuery, visualised through Looker or similar BI tools.
  • Exposure to ETL/ELT frameworks such as SQL Server Integration Services (SSIS), dbt Cloud, or Dataform.
  • Skilled in the design, implementation, monitoring, and optimisation of modern data platforms and pipelines to support analytical and operational workloads.
  • Experience with Azure Data Factory (ADF) or similar orchestration tools is a strong advantage.
  • Familiarity with one or more programming languages such as Python, Java, Scala, R, or C# (Python preferred).
  • Experience applying modern software engineering practices, including version control, CI/CD, and Agile methodologies.
  • Strong computational and analytical thinking skills, with the ability to design efficient and scalable data solutions.
  • Bachelor’s degree in Computer Science, Engineering, or a related technical discipline, or equivalent practical experience in data engineering.
  • Data Stack: Azure Microsoft SQL Server (MI), Fivetran, Airbyte, Google BigQuery, Google Cloud Platform (GCP), dbt Cloud, Dataform (GCP), Looker.