Data Engineer

Sabenza IT & Recruitment

Johannesburg

On-site

ZAR 700 000 - 900 000

Full time

Yesterday

Job summary

A leading IT recruitment agency is hiring a Data Engineer to enhance fraud detection and financial crime technology. The successful candidate will design ETL pipelines in Azure, work with Databricks, and ensure high-quality data management. Candidates should have proven experience with SQL, data modelling, and Agile methodologies. This contract role focuses on modernising the bank's systems, providing an exciting opportunity to be part of significant technological advancements in a dynamic environment.

Qualifications

  • Proven experience in Azure Data Factory, Databricks, and Synapse Analytics.
  • Strong SQL skills (T-SQL, stored procedures, optimization).
  • Hands-on experience with REST API integrations.

Responsibilities

  • Design and develop end-to-end ETL and data pipelines.
  • Build and maintain Enterprise Data Warehouse assets.
  • Work closely with business teams to translate requirements.

Skills

  • Azure Data Factory
  • Databricks
  • Synapse Analytics
  • SQL
  • REST API
  • Data Vault 2.0
  • Agile
  • Azure Analysis Services

Job description

We’re Hiring: Data Engineer – Fraud & Financial Crime Technology Platform (6-Month Contract)

Are you a talented Data Engineer passionate about transforming data into actionable insights? Join our Fraud and Financial Crime Technology Platform (FFCP) team and play a key role in modernising the bank’s fraud detection and financial crime systems.

As part of our FFCP team, you’ll be at the forefront of innovation, helping replace legacy systems with cutting‑edge cloud‑native solutions. You will work on Client Screening and Transaction Monitoring capabilities, ensuring data flows seamlessly to power our fraud detection and investigation engines.

Requirements
What You’ll Do:
  • Design and develop end-to-end ETL and data pipelines in Azure, moving away from manual flat‑file feeds to automated, resilient solutions.

  • Build and maintain Enterprise Data Warehouse (EDW) assets, data marts, and business‑focused data products using Data Vault 2.0 methodologies.

  • Work closely with business teams to translate requirements into robust data models and technical pipelines.

  • Implement monitoring, alerting, reconciliation, and exception handling to ensure high‑quality, secure, and auditable data.

  • Integrate with external systems and APIs, handling authentication, pagination, and error management.

  • Support CI/CD deployments, DevOps pipelines, and operational readiness for production environments.

  • Collaborate with architects, engineers, and compliance teams to ensure alignment with overall data strategy and regulatory standards.
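To give a flavour of the API-integration work described above (pagination, retries, and error handling when pulling records into a pipeline), here is a minimal sketch. The response shape (`items`, `next_page`) and the retry policy are illustrative assumptions for this example, not the bank's actual interfaces or standards:

```python
from typing import Callable, Iterator

def fetch_all_pages(
    fetch_page: Callable[[int], dict],
    max_retries: int = 3,
) -> Iterator[dict]:
    """Iterate over a paginated REST source, retrying transient failures.

    `fetch_page(page)` is assumed to return a JSON-like dict with
    "items" (a list of records) and "next_page" (int or None) --
    a hypothetical response shape, not a specific vendor API.
    """
    page = 1
    while page is not None:
        for attempt in range(max_retries):
            try:
                payload = fetch_page(page)
                break  # page fetched successfully
            except ConnectionError:
                if attempt == max_retries - 1:
                    raise  # exhausted retries: surface to exception handling
        # Emit records one by one so downstream stages can stream them.
        yield from payload["items"]
        page = payload.get("next_page")
```

In practice `fetch_page` would wrap an authenticated HTTP call; separating it out keeps pagination and retry logic testable without a live endpoint.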

What You’ll Bring:
  • Proven experience in Azure Data Factory, Databricks, Synapse Analytics, and ADLS Gen2.

  • Strong SQL skills (T‑SQL, stored procedures, functions, optimization) and experience with SQL Server/SSIS.

  • Hands‑on experience with REST API integrations.

  • Solid knowledge of data modelling (Data Vault 2.0, dimensional/Kimball).

  • Experience with Azure Analysis Services, CI/CD pipelines, and infrastructure‑as‑code.

  • Excellent analytical, problem‑solving, and collaborative skills.

  • Comfortable working in Agile or DevOps‑style delivery environments.

Nice to Have:
  • Experience with financial crime systems (Actimize, Dow Jones, or similar).

  • Familiarity with event‑driven architectures, microservices, and data products.

  • Understanding of KYC, AML, sanctions frameworks, and financial markets.

  • Programming skills in C#/.Net Core.
