Data Engineer

Mi-C3 International

Johannesburg

On-site

ZAR 200 000 - 300 000

Full time

9 days ago

Job summary

A forward-thinking technology company in Johannesburg seeks a mid-level Data Integration Engineer to design and implement real-time data integration pipelines using Apache NiFi. The ideal candidate will have strong skills in Java and Python, with the ability to handle data from diverse sources. The role offers opportunities for professional development in a collaborative environment where innovation is valued.

Benefits

Professional development opportunities
Comprehensive compensation package
Supportive work environment

Qualifications

  • Hands-on experience in designing and implementing real-time data integration solutions.
  • Strong analytical and problem-solving abilities.
  • Excellent communication and teamwork skills.

Responsibilities

  • Design and implement data streaming solutions using Apache NiFi.
  • Ingest data from diverse sources including IoT protocols and APIs.
  • Develop and maintain ETL pipelines for data analysis.

Skills

Apache NiFi
Java
Python
Data pipelines
ETL workflows
Kafka
RabbitMQ
MQTT
SNMP
WebSockets

Education

Bachelor's degree in Computer Science, Information Technology, or a related field

Tools

Apache Spark
NiFi
Fluvio
Rust

Job description
About MI-C3
Established by CEO and founder Glen Scott, MI-C3 International Limited is a Malta-based company specializing in delivering trusted software solutions tailored for mission-critical environments.

Our flagship product, AFFECTLI, empowers organizations to make informed, data-driven decisions by providing a consolidated, real-time view of complex operations.

We pride ourselves on fostering a collaborative, agile work environment that celebrates diversity, rewards innovation, and values continuous improvement.

Data Integration Engineer (NiFi)

We are seeking a mid-level Data Integration Engineer with hands‑on experience in Apache NiFi to join our dynamic team.

In this role, you will design, implement, and maintain real‑time data integration pipelines, handling data from diverse sources such as IoT / IIoT devices, third‑party APIs, and raw files.

Your primary focus will be on processing streaming data to provide valuable insights that drive informed decision‑making within our organization.

As MI-C3 transitions towards Fluvio and Rust, experience with these technologies will be advantageous but is not mandatory.

The ideal candidate will possess a deep understanding of data pipelines, real‑time event streaming, and ETL workflows, coupled with a passion for exploring and implementing new technologies.

Key Responsibilities

Collaborate with cross‑functional teams to design and implement scalable, real‑time data streaming solutions using Apache NiFi.

Ingest and process data from various sources, including IoT / IIoT protocols (e.g., MQTT, SNMP, CoAP, TCP, WebSockets) and third‑party APIs.
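
For illustration, a minimal ingestion sketch in Python using the paho-mqtt 1.x client; the broker host and topic are placeholder assumptions, not details from this posting:

    import json
    import paho.mqtt.client as mqtt

    def on_connect(client, userdata, flags, rc):
        # Subscribe once the connection is established.
        client.subscribe("sensors/#")

    def on_message(client, userdata, msg):
        # Payload assumed to be JSON from an edge device.
        reading = json.loads(msg.payload)
        print(msg.topic, reading)

    client = mqtt.Client()
    client.on_connect = on_connect
    client.on_message = on_message
    client.connect("broker.example.com", 1883)
    client.loop_forever()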

Develop and maintain robust ETL pipelines, ensuring data is transformed and loaded efficiently for analysis and storage.
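
As a sketch of the extract-transform-load shape such a pipeline takes, here is a skeletal Python pass; the CSV input, field names, and SQLite target are chosen purely for illustration:

    import csv
    import sqlite3

    def extract(path):
        # Pull raw rows from a source file.
        with open(path, newline="") as f:
            yield from csv.DictReader(f)

    def transform(row):
        # Cast types and drop rows missing a device id.
        if not row.get("device_id"):
            return None
        return (row["device_id"], float(row["value"]))

    def load(rows, db="readings.db"):
        # Persist clean rows for downstream analysis.
        con = sqlite3.connect(db)
        con.execute("CREATE TABLE IF NOT EXISTS readings (device_id TEXT, value REAL)")
        con.executemany("INSERT INTO readings VALUES (?, ?)", rows)
        con.commit()
        con.close()

    load(r for r in map(transform, extract("readings.csv")) if r is not None)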

Continuously monitor and optimise data workflows to maintain low‑latency, high‑throughput processing capabilities.

Configure and manage message brokers such as Kafka and RabbitMQ (AMQP) to facilitate efficient data exchange and support event‑driven architectures.
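
For example, publishing an event to Kafka with the kafka-python client; the bootstrap server, topic name, and payload are illustrative assumptions:

    import json
    from kafka import KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        # Serialize dicts to JSON bytes on the way out.
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("device-readings", {"device_id": "pump-7", "value": 3.2})
    producer.flush()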

Implement validation checks and quality measures to ensure the accuracy, reliability, and integrity of integrated data.
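
A simple validation helper in Python illustrates the idea; the field names and value bounds are assumed for the example:

    def validate(reading):
        """Return a list of problems; an empty list means the record is clean."""
        problems = []
        if not reading.get("device_id"):
            problems.append("missing device_id")
        value = reading.get("value")
        if not isinstance(value, (int, float)):
            problems.append("value is not numeric")
        elif not (-40.0 <= value <= 125.0):  # plausible sensor range, assumed
            problems.append("value out of range")
        return problems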

Proactively identify, diagnose, and resolve issues related to data ingestion, transformation, and streaming processes to ensure uninterrupted data flow.

Technical Requirements

Demonstrated experience in designing and implementing data integration solutions using Apache NiFi for real‑time streaming data.

Strong skills in Java and Python for developing custom data processing components and applications.

Familiarity with tools such as Apache Spark and Kafka for building scalable data integration solutions.

Experience configuring and managing message brokers such as RabbitMQ (AMQP) and Kafka to enable efficient data exchange.

Hands‑on experience with protocols such as MQTT, SNMP, CoAP, TCP, and WebSockets for data capture from edge devices and industrial systems.

Knowledge of data validation techniques and quality assurance practices to ensure reliable data integration.

Strong analytical and problem‑solving abilities, with a keen attention to detail.

Excellent communication and teamwork skills to effectively collaborate with cross‑functional teams.

A proactive mindset with a willingness to learn and work with new tools and technologies, including Fluvio and Rust.

Preferred Qualifications

Bachelor's degree in Computer Science, Information Technology, or a related field.

Familiarity with Fluvio and Rust is a plus.

Experience with cloud‑based platforms and distributed systems is advantageous.

Understanding of embedded systems, requirements engineering, and systems integration is beneficial.

What We Offer

Be part of a forward‑thinking company that values innovation and continuous improvement.

Opportunities for professional development and career advancement within a growing organization.

A supportive and inclusive work environment that values diversity and collaboration.

A comprehensive compensation package commensurate with experience and qualifications.
