Intermediate Data Engineer

NTT Ltd.

Johannesburg

On-site

ZAR 500 000 - 700 000

Full time

6 days ago

Job summary

A prominent technology firm seeks an Intermediate Data Engineer in Johannesburg. This role involves data transformation and developing data models for analytics. Candidates should have a Bachelor's degree in a related field, advanced SQL skills, and experience with tools like Microsoft Azure Data Factory. Join a dynamic team to enhance data-driven solutions.

Qualifications

  • Proficient in building data analytics solutions from large datasets.
  • Advanced experience with the architecture of scalable systems.
  • Experience with automation and scripting.

Responsibilities

  • Design and develop scalable ETL packages.
  • Accountable for running data migrations across databases.
  • Work with stakeholders to define data requirements.

Skills

Problem-solving aptitude
Analytical mindset
Effective communication
Data architecture understanding
Microsoft Azure Data Factory
Python

Education

Bachelor's degree in computer science or related field
Relevant certifications (SAP, Microsoft Azure)

Tools

SQL Analysis Server
SAP Data Services
Microsoft SQL
Hadoop
Cassandra

Job description

Overview

Job title: Intermediate Data Engineer

Job Location: Gauteng, Johannesburg

Deadline: November 29, 2025

Your day at NTT DATA
  • The Intermediate Data Engineer is an advanced subject matter expert, accountable for transforming data into a structured format that can be easily analyzed in a query or report.
  • This role is responsible for developing structured data sets that can be reused by, or complement, other data sets and reports.
  • This role analyzes data sources and data structures, and designs and develops data models to support the analytics requirements of the business, including management, operational, predictive, and data science capabilities.
Key Responsibilities:
  • Designs data models in a structured data format to enable analysis thereof.
  • Designs and develops scalable extract, transform, and load (ETL) packages from the business source systems, as well as ETL routines to populate data from those sources.
  • Participates in the transformation of object and data models into appropriate database schemas within design constraints.
  • Interprets installation standards to meet project needs and produces database components as required.
  • Directs test scenarios and participates in thorough testing and validation to ensure the accuracy of data transformations.
  • Accountable for running data migrations across different databases and applications, for example MS Dynamics, Oracle, SAP, and other ERP systems.
  • Works across multiple IT and business teams to define and implement data table structures and data models based on requirements.
  • Accountable for the analysis and development of ETL and migration documentation.
  • Works with various stakeholders to evaluate potential data requirements.
  • Accountable for the definition and management of scoping, requirements definition, and prioritization activities for small-scale changes, and assists with more complex change initiatives.
  • Networks with various stakeholders, contributing to recommendations for improvements in automated and non-automated components of data tables, data queries, and data models.
Knowledge and Attributes:
  • Advanced knowledge of the definition and management of scoping, requirements definition, and prioritization activities.
  • Advanced understanding of database concepts, object and data modelling techniques and design principles and conceptual knowledge of building and maintaining physical and logical data models.
  • Advanced expertise in Microsoft Azure Data Factory, SQL Analysis Server, SAP Data Services, SAP BTP.
  • Advanced understanding of the data architecture landscape spanning physical and logical data models.
  • Analytical mindset with excellent business acumen.
  • Problem-solving aptitude with the ability to communicate effectively, both written and verbal.
  • Ability to think strategically and build effective relationships at all levels within the organization.
  • Advanced expertise in programming and scripting languages (Perl, Bash, shell scripting, Python, etc.).
Academic Qualifications and Certifications:
  • Bachelor's degree or equivalent in computer science, software engineering, information technology, or a related field.
  • Relevant certifications preferred, such as SAP, Microsoft Azure, etc.
  • Certified Data Engineer or Certified Professional certification preferred.
Required Experience:
  • Advanced demonstrated experience in data engineering and data mining within a fast-paced environment.
  • Proficient in building modern data analytics solutions that deliver insights from large and complex data sets at multi-terabyte scale.
  • Advanced demonstrated experience with the architecture and design of secure, highly available, and scalable systems.
  • Advanced proficiency in automation and scripting, with proven examples of successful implementation.
  • Advanced proficiency in scripting languages (Perl, Bash, shell scripting, Python, etc.).
  • Advanced demonstrated experience with big data tools such as Hadoop, Cassandra, Storm, etc.
  • Advanced demonstrated experience in any applicable language, preferably .NET.
  • Advanced proficiency in SAP, SQL, MySQL, and Microsoft SQL databases.
  • Advanced demonstrated experience working with data sets and ordering data through MS Excel functions, e.g. macros and pivot tables.