
Specialist Data Engineer

Absa Group

Gauteng

On-site

ZAR 400 000 - 600 000

Full time

3 days ago

Job summary

A well-known financial institution is seeking a Data Engineer to work collaboratively across squads in designing and optimizing data solutions. The ideal candidate will have at least 3 years of experience and proficiency in Hadoop, with knowledge of Spark or AWS considered advantageous. Responsibilities include automating data delivery processes, building analytics tools, and ensuring the quality and governance of data. This role offers a chance to contribute to innovative data solutions within a dynamic team.

Qualifications

  • 3+ years of relevant experience in data engineering roles.
  • Proficiency in Hadoop and familiarity with Spark or AWS.
  • Experience with data lake formation.

Responsibilities

  • Work in multiple squads to develop data solutions.
  • Design and build analytics tools using data pipelines.
  • Automate data delivery processes and monitor performance.

Skills

  • Hadoop
  • Spark
  • AWS
  • In-house ETL tools
  • Data architecture

Education

  • Bachelor's Degree in Information Technology

Job description

Empowering Africa's tomorrow, together...one story at a time.

With a rich history and strongly positioned as a local bank with regional and international expertise, a career with our family offers the opportunity to be part of this exciting growth journey, to reset our future and shape our destiny as a proudly African group.

Job Summary

Work embedded as a member of a squad, or across multiple squads, to produce, test, document and review algorithms & data-specific source code that supports the deployment & optimisation of data retrieval, processing, storage and distribution for a business area.

Data Architecture & Data Engineering

Understand the technical landscape and bank-wide architecture that is connected to or dependent on the supported business area in order to effectively design & deliver data solutions (architecture, pipelines, etc.)

Translate / interpret the data architecture direction and associated business requirements & leverage expertise in analytical & creative problem solving to synthesise data solution designs (building a solution from its components) rather than merely analysing the problem

Participate in design thinking processes to successfully deliver data solution blueprints

Leverage state-of-the-art relational and NoSQL databases, as well as integration and streaming platforms, to deliver sustainable, business-specific data solutions.

Design data retrieval, storage & distribution solutions (and/or components thereof), including contributing to all phases of the development lifecycle, e.g. the design process

Develop high-quality data processing, retrieval, storage & distribution designs in a test-driven & domain-driven / cross-domain environment

Build analytics tools that utilise the data pipeline by quickly producing well-organised, optimised and documented source code & algorithms to deliver technical data solutions
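
For illustration, a minimal sketch of this kind of analytics-tool code, assuming PySpark (Spark is listed under the required skills) and purely hypothetical paths, table and column names:

# Minimal sketch (assumptions: PySpark available; a "transactions" dataset already
# landed by the upstream pipeline; paths and column names are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("monthly-account-spend").getOrCreate()

# Read the curated dataset produced by the data pipeline.
txns = spark.read.parquet("/data/curated/transactions")

# Aggregate spend per account per month for analytics consumers.
monthly_spend = (
    txns.withColumn("month", F.date_trunc("month", F.col("txn_date")))
        .groupBy("account_id", "month")
        .agg(F.sum("amount").alias("total_spend"),
             F.count("*").alias("txn_count"))
)

# Persist in a partitioned, query-friendly layout for downstream teams.
monthly_spend.write.mode("overwrite").partitionBy("month").parquet(
    "/data/analytics/monthly_account_spend")

spark.stop()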

Create & Maintain Sophisticated CI / CD Pipelines (authoring & supporting CI / CD pipelines in Jenkins or similar tools and deploy to multi-site environments – supporting and managing your applications all the way to production)

Automate tasks through appropriate tools and scripting technologies e.g. Ansible, Chef

Debug existing source code and polish feature sets.

Assemble large, complex data sets that meet business requirements & manage the data pipeline

Build infrastructure to automate the delivery of extremely high volumes of data

Create data tools for analytics and data science teams that assist them in building and optimizing data sets for the benefit of the business

Ensure designs & solutions support the technical organisation principles of self-service, repeatability, testability, scalability & resilience

Apply general design patterns and paradigms to deliver technical solutions

Inform & support the infrastructure build required for optimal extraction, transformation, and loading of data from a wide variety of data sources

Support the continuous optimisation, improvement & automation of data processing, retrieval, storage & distribution processes

Ensure the quality assurance and testing of all data solutions aligned to the QA Engineering & broader architectural guidelines and standards of the organisation

Implement & align to the Group Security standards and practices to ensure the undisputable separation, security & quality of the organisation's data

Meaningfully contribute to & ensure solutions align to the design & direction of the Group Architecture & in particular data standards, principles, preferences & practices.

Short-term deployment must align with strategic long-term delivery.

Meaningfully contribute to & ensure solutions align to the design and direction of the Group Infrastructure standards and practices, e.g. OLAs, IaaS, PaaS, SaaS, containerisation, etc.

Monitor the performance of data solution designs & ensure ongoing optimisation of data solutions

Stay ahead of the curve on data processing, retrieval, storage & distribution technologies & processes (global best practices & trends) to ensure best practice

People

Coach & mentor other engineers

Conduct peer reviews, testing, problem solving within and across the broader team

Build data science team capability in the use of data solutions

Risk & Governance

Identify technical risks and mitigate these (pre, during & post deployment)

Update / design all application documentation aligned to the organisation's technical standards and risk / governance frameworks

Create business cases & solution specifications for various governance processes (e.g. CTO approvals)

Participate in incident management & DR activity – applying critical thinking, problem solving & technical expertise to get to the bottom of major incidents

Deliver on time & on budget (always)

Skills & Experience Required

3+ years relevant experience

Proficiency in Hadoop required

Spark and / or AWS knowledge a distinct advantage

Experienced in data lake formation

Ability to adapt to in-house built ETL tools

Education

Bachelor's Degree: Information Technology

Absa Bank Limited is an equal opportunity, affirmative action employer.

In compliance with the Employment Equity Act 55 of 1998, preference will be given to suitable candidates from designated groups whose appointments will contribute towards achievement of equitable demographic representation of our workforce profile and add to the diversity of the Bank.

Absa Bank Limited reserves the right not to make an appointment to the post as advertised.
