
Data Engineer & Data Architect

multiSEARCH Recruitment

Johannesburg

On-site

ZAR 600 000 - 800 000

Full time

Posted today


Job summary

A recruitment agency is seeking an experienced data engineer to design and maintain scalable data architectures and build robust ETL pipelines. The ideal candidate will have over 5 years of experience in data engineering, strong proficiency in SQL, and solid programming skills in languages such as Python. They will ensure compliance with fintech regulatory requirements and collaborate closely with stakeholders on data strategy. The role is an opportunity to enable analytics and BI workloads on reliable data platforms.

Qualifications

  • 5+ years of experience in data engineering, data architecture, or related roles.
  • Strong proficiency in SQL and data modelling.
  • Experience building data pipelines using tools such as Airflow or Spark.

Responsibilities

  • Design and maintain scalable data architectures.
  • Build robust ETL/ELT pipelines for data processing.
  • Ensure regulatory compliance for financial data.

Skills

Data engineering
Data architecture
SQL proficiency
Python programming
ETL pipeline development

Tools

Airflow
dbt
Spark
BigQuery
Snowflake

Job description

Overview

  • Design and maintain scalable, secure, high-performance data architectures
  • Build robust ETL/ELT pipelines for batch and streaming data
  • Enable analytics, BI, and AI workloads through reliable data platforms
  • Ensure regulatory compliance and data governance for sensitive financial data

Data Architecture & Design

  • Design and maintain scalable, secure, and high-performance data architectures
  • Define data models, schemas, and standards across transactional and analytical systems
  • Architect data lakes, data warehouses, and real-time data pipelines
  • Ensure alignment with fintech regulatory and compliance requirements

Data Engineering

  • Build and maintain ETL/ELT pipelines for batch and streaming data (a brief sketch follows this list)
  • Integrate data from internal systems, third-party APIs, and financial data providers
  • Optimize data processing for performance, reliability, and cost efficiency
  • Ensure data quality, consistency, and availability
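
For illustration only: a minimal sketch of the kind of batch pipeline this section describes, assuming a recent Airflow 2.x release (2.4+ for the schedule parameter). The DAG id, schedule, and task logic are hypothetical, not part of the role's actual stack.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Pull raw records from a source system or third-party API (placeholder).
        print("extracting transactions")

    def transform():
        # Apply data-quality checks and conform records to the warehouse schema (placeholder).
        print("validating and transforming records")

    def load():
        # Write conformed records to the analytical store (placeholder).
        print("loading into the warehouse")

    with DAG(
        dag_id="daily_transactions_etl",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task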

Cloud & Infrastructure

  • Develop and manage cloud-based data platforms (AWS, GCP, or Azure)
  • Implement infrastructure-as-code and CI/CD for data pipelines (a brief sketch follows this list)
  • Monitor and troubleshoot data systems in production
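
For illustration only: a minimal infrastructure-as-code sketch using Pulumi's Python SDK, assuming a configured Pulumi project and AWS credentials; the resource name and tags are hypothetical.

    import pulumi
    import pulumi_aws as aws

    # Raw landing zone for batch extracts (hypothetical bucket).
    raw_bucket = aws.s3.Bucket(
        "raw-data",
        tags={"team": "data-platform", "env": "dev"},
    )

    # Export the generated bucket name for downstream stacks and pipelines.
    pulumi.export("raw_bucket_name", raw_bucket.bucket)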

Security & Governance

  • Implement data security, access controls, encryption, and monitoring
  • Support data governance, lineage, and metadata management
  • Collaborate with compliance and risk teams on audits and regulatory needs

Collaboration & Leadership

  • Work closely with product managers, analysts, and ML engineers
  • Provide technical guidance and best practices for data usage
  • Document architectures, workflows, and standards
  • Mentor junior data engineers (if applicable)

Minimum Requirements

  • 5+ years of experience in data engineering, data architecture, or related roles
  • Strong proficiency in SQL and data modelling (dimensional and/or normalized)
  • Experience building data pipelines using tools such as Airflow, dbt, Spark, or similar
  • Strong programming skills in Python, Scala, or Java
  • Hands-on experience with cloud data services (BigQuery, Redshift, Snowflake, Databricks)
  • Experience with both batch and streaming data systems such as Kafka, Kinesis, or Pub/Sub (a streaming sketch follows this list)
  • Solid understanding of data security, privacy, and governance principles
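
For illustration only: a minimal streaming sketch using the kafka-python client; the topic, group id, and broker address are hypothetical.

    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "payments.events",                   # hypothetical topic
        bootstrap_servers="localhost:9092",  # hypothetical broker
        group_id="analytics-ingest",         # hypothetical consumer group
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    for message in consumer:
        event = message.value
        # Basic data-quality gate before downstream processing.
        if "transaction_id" not in event:
            continue
        print(f"received transaction {event['transaction_id']}")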

Preferred Qualifications

  • Experience in fintech, banking, payments, or financial services
  • Familiarity with regulatory frameworks (PCI DSS, SOC 2, GDPR)
  • Experience supporting analytics, BI, and machine learning workloads
  • Knowledge of event-driven and real-time data architectures
  • Prior experience leading architectural decisions or data platform migrations

Core Technical Competencies

  • Data Architecture & Modelling – Scalable, resilient designs for transactional and analytical workloads
  • Data Engineering & Pipelines – Reliable batch and streaming pipelines with strong data quality
  • Cloud & Platform Engineering – Proficient in AWS, GCP, or Azure with IaC and CI/CD
  • Programming & Querying – Advanced SQL and Python/Scala/Java expertise
  • Data Security & Governance – Encryption, access control, compliance readiness