Data Engineer

Standard Bank of South Africa Limited

Gauteng

On-site

ZAR 600 000 - 900 000

Full time

Job summary

A leading company in the Insurance & Asset Management sector is seeking a Data Engineer to develop and maintain data architecture. The role involves designing and operating data pipelines, ensuring data is accessible for downstream use, and executing data engineering duties in line with established standards. Candidates should have a relevant degree and 5-7 years of experience building data solutions.

Qualifications

  • 5-7 years' experience building databases, data warehouses, reporting, and data integration solutions.
  • Experience with big data pipelines and architectures.
  • Proficiency in database programming languages.

Responsibilities

  • Develop and maintain complete data architecture across multiple application platforms.
  • Design, build, operationalise, secure, and monitor data pipelines.
  • Ensure data is accessible for evaluation and optimisation.

Skills

Database programming languages
Data principles
Data integration
Performance optimisation
API integration

Education

First Degree in Business Commerce, Information Studies, or Information Technology

Tools

SQL
PL/SQL
Spark

Job description

Business Segment: Insurance & Asset Management

Location: ZA, GP, Roodepoort, Ellis Street

Job Responsibilities:
  • Develop and maintain complete data architecture across multiple application platforms.
  • Design, build, operationalise, secure, and monitor data pipelines and data stores in accordance with architecture, standards, policies, and governance requirements.
  • Ensure data is accessible for evaluation and optimisation for downstream use cases.
  • Execute data engineering duties following standards, frameworks, and roadmaps.
Qualifications:
  • First Degree in Business Commerce, Information Studies, or Information Technology.
Experience Required:
  • 5-7 years' experience building databases, data warehouses, reporting, and data integration solutions.
  • Experience with big data pipelines, architectures, and datasets.
  • Experience in creating and integrating APIs.
  • Proficiency in database programming languages such as SQL, PL/SQL, Spark, or similar tools.
  • Experience with data pipeline and workflow management tools.
  • Understanding of data principles, pipelining, performance optimisation, and organisational data integration.
Additional Notes:

All recruitment processes comply with applicable laws. We will never ask for money or payments during recruitment. To report suspicious activity, contact our Fraud line.
