Senior Developer

Arganteal, Corp.

Cape Town

On-site

ZAR 800 000 - 1 200 000

Full time

Job summary

A data science technology company is seeking a Senior Developer specializing in Data & AI in Cape Town. You will develop a cutting-edge platform that processes multi-modal sensor data. The ideal candidate has strong Python proficiency and experience with a range of data storage technologies. This full-time role offers the chance to work on innovative data solutions within a collaborative team environment.

Qualifications

  • Strong proficiency in Python, with C++ as a plus.
  • 5+ years professional software development experience.
  • Proven experience designing and deploying modular platforms.

Responsibilities

  • Architect, develop, and deploy core modules for data ingestion.
  • Implement real-time ingestion pipelines using Apache Kafka.
  • Collaborate with frontend engineers for UI integration.

Skills

  • Strong proficiency in Python
  • Hands-on experience with Kafka
  • Experience with Redis Streams
  • MongoDB experience
  • Neo4j experience
  • Knowledge of Docker
  • Strong debugging skills

Education

Bachelor's degree in Computer Science or related field

Tools

  • Docker
  • Kubernetes
  • AWS

Job description

Recruitment Policy & Eligibility

Arganteal accepts applications from direct candidates only. We do not work with third‑party recruiters or staffing agencies.

Required Country Location: Costa Rica, Peru, Argentina, Brazil, Colombia, South Africa, Mexico, or Panama.

This is a full‑time role (40 hours per week).

Overview

Our client seeks a motivated Senior Developer, Data & AI to join their team in developing a groundbreaking, modular platform built from the ground up. The platform digitizes and contextualizes multi‑modal sensor data from both digital and physical environments into specialized time‑series, graph, and vector databases, powering real‑time analytics, compliance, and AI‑driven context mapping.

Key Responsibilities
  • Platform Design & Development: Architect, develop, and deploy core modules (Data, Access, & Agents) for end‑to‑end data ingestion, contextualization, and visualization.
  • Design and code sensor collection agents across heterogeneous systems (Windows, Linux, macOS, mobile, IoT).
  • Implement real‑time ingestion pipelines using technologies like Apache Kafka, Apache NiFi, Redis Streams, or AWS Kinesis.
  • Persist and query multi‑modal data across time‑series (MongoDB, InfluxDB, TimescaleDB), graph (Neo4j), and vector databases (Qdrant, FAISS, Pinecone, or Weaviate).
  • Build secure, scalable RESTful and GraphQL APIs for exposing platform data models, sensor configuration, and reporting.
  • Implement a unified Database Access Layer (DBAL) to abstract query logic across multiple databases.
  • Experiment with or extend Model Context Protocol (MCP) or a similar standardized data interchange for multi‑DB, multi‑agent interoperability.
  • Develop low‑latency data pipelines for transporting and transforming event streams (syslog, telemetry, keystrokes, IoT feeds, cloud service logs).
  • Collaborate with frontend engineers to connect Access (visual mapping UI) with back‑end pipelines.
  • Optimize database query performance using down‑sampling, partitioning, and caching techniques.
  • Design solutions for horizontal scaling and containerized deployment (Docker, Kubernetes, OpenShift).
  • Apply a "MacGyver‑mindset" for rapid prototyping and iterative refinement under real‑world constraints.
  • Work directly with compliance officers, security analysts, and business process owners to refine data models for regulatory and operational needs.
  • Conduct code reviews, mentor junior developers, and promote best practices across the team.
Required Skills & Experience
  • Programming: Strong proficiency in Python (C++ a plus).
  • Streaming: Hands‑on experience with Kafka, NiFi, Redis Streams, or AWS Kinesis.
  • Databases:
    • Time‑series: MongoDB, InfluxDB, TimescaleDB, or AWS Timestream.
    • Graph: Neo4j (Cypher, APOC, graph schema design).
    • Vector: Qdrant, FAISS, Pinecone, or Weaviate.
  • AI / Agents: Experience with—or strong interest in—Agentic AI frameworks, multi‑agent orchestration, and context‑aware data processing.
  • Data Interchange: Familiarity with MCP‑like protocols or interest in defining standardized APIs for cross‑database access.
  • Cloud / Infra: AWS, Azure, or GCP with containerization (Docker, Kubernetes).
  • Software Engineering: Strong grasp of algorithms, distributed systems, micro‑service design, and API security.
  • Problem Solving: Strong debugging skills, creative mindset, and ability to balance speed with scalability.
Preferred Skills
  • Machine Learning / NLP integration into multi‑modal pipelines.
  • CI/CD automation and DevOps practices.
  • Knowledge of enterprise integration patterns, event‑driven systems, and zero‑trust security models.
  • Experience with compliance frameworks (NERC CIP, FedRAMP, GDPR, SOX).
Qualifications
  • Bachelor's degree in Computer Science, Engineering, or related field (or equivalent hands‑on experience).
  • 5+ years professional software development with data‑intensive or AI‑driven systems.
  • Proven experience designing, deploying, and scaling modular platforms in production.