
Data Engineer

Kandua

Johannesburg

Remote

ZAR 400 000 - 800 000

Full time

9 days ago


Job summary

Join an innovative company transforming the service industry in South Africa. As a Data Engineer, you will play a crucial role in defining and scaling data capabilities, designing robust data pipelines, and collaborating with cross-functional teams to turn raw data into actionable insights. The position is remote with flexible arrangements, at a fast-growing startup that values innovation and problem-solving. If you are passionate about data quality and building sustainable systems, this is an opportunity to make a significant impact.

Benefits

Flexible work arrangements
Remote work opportunities
Cutting-edge cloud technologies
Opportunity to shape DevOps culture

Qualifications

  • 6+ years in data engineering or software engineering roles focusing on data infrastructure.
  • Strong SQL skills and proficiency with modern data stack tools.

Responsibilities

  • Design and manage scalable data pipelines using ELT/ETL techniques.
  • Collaborate with product teams to develop version-controlled data models.

Skills

SQL
Data Pipeline Development
Data Quality Assurance
Problem Solving
Programming Skills

Education

Bachelor's Degree in Computer Science or related field

Tools

Airflow
Kafka
Snowflake
BigQuery
dbt

Job description

About Kandua

The Kandua Company helps small service businesses grow. We connect them to new customers and simplify business management with easy-to-use tech tools. Kandua.com is South Africa’s #1 online marketplace for home services. Every month, over 40,000 vetted home service professionals access around R50 million worth of work opportunities from individual customers, as well as from business customers via Kandua’s partnerships with leaders in insurance and retail.

The Kandua for Pros app provides a mobile platform for professionals to send quotes and invoices, accept payments, track customer communication, and view business performance, all securely stored in the cloud. Our mission is to use technology to bridge the gap between skills and livelihood, supporting those who serve us daily.

What does this role involve?

We are seeking a pragmatic and forward-thinking Data Engineer to define and scale our data capabilities. This role involves designing, building, and maintaining data pipelines, models, and infrastructure that support our analytics, operations, and product personalization. You will have the chance to shape our modern data stack and collaborate closely with engineering, product, operations, and growth teams to convert raw data into actionable insights and automated workflows.

Key Responsibilities

  1. Design, build, and manage reliable, scalable data pipelines using batch and streaming techniques (e.g., ELT/ETL, Kafka, Airflow).
  2. Own and evolve the structure and architecture of our Data Lakehouse and medallion architecture.
  3. Develop robust processes for data ingestion, validation, transformation, and delivery across multiple systems and sources.
  4. Implement tools and frameworks for monitoring, testing, and ensuring data quality, lineage, and observability.
  5. Collaborate with analytics and product teams to develop well-documented, version-controlled data models that support reporting, dashboards, and experiments.
  6. Leverage geospatial and behavioral data to create features for search, matching, and personalization algorithms.
  7. Partner with the ML team to support deployment of machine learning workflows and real-time inference pipelines.
  8. Research and prototype emerging tools and technologies to enhance Kandua’s data stack and developer experience.
  9. Monitor and support production workloads to ensure performance, availability, and cost-efficiency.
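To make the batch ELT pattern in the responsibilities above concrete, here is a toy, self-contained sketch in plain Python. This is illustrative only, not Kandua's actual stack: a production pipeline of this kind would be orchestrated in Airflow and land data in a warehouse such as BigQuery or Snowflake; the table and column names here (`raw_quotes`, `quotes_by_pro`, `pro_id`) are hypothetical.

```python
# Toy batch ELT step: land raw data first, then transform inside the store.
# Uses stdlib sqlite3 as a stand-in for a cloud warehouse; all names are
# hypothetical and for illustration only.
import sqlite3

def extract(rows):
    """Simulate pulling raw records from a source system, dropping
    records with no professional ID."""
    return [r for r in rows if r.get("pro_id") is not None]

def load_raw(conn, rows):
    """Land raw data as-is (the 'EL' of ELT)."""
    conn.execute("CREATE TABLE IF NOT EXISTS raw_quotes (pro_id INT, amount REAL)")
    conn.executemany("INSERT INTO raw_quotes VALUES (:pro_id, :amount)", rows)

def transform(conn):
    """Build a cleaned, aggregated model from the raw layer (the 'T'),
    filtering out invalid amounts."""
    conn.execute("""
        CREATE TABLE quotes_by_pro AS
        SELECT pro_id, SUM(amount) AS total_quoted
        FROM raw_quotes
        WHERE amount > 0
        GROUP BY pro_id
    """)

source = [
    {"pro_id": 1, "amount": 1500.0},
    {"pro_id": 1, "amount": 800.0},
    {"pro_id": 2, "amount": -5.0},    # bad record, filtered in transform
    {"pro_id": None, "amount": 99.0}, # bad record, filtered in extract
]

conn = sqlite3.connect(":memory:")
load_raw(conn, extract(source))
transform(conn)
result = dict(conn.execute("SELECT pro_id, total_quoted FROM quotes_by_pro"))
print(result)  # {1: 2300.0}
```

The design choice worth noting is that validation happens in the transform layer rather than before loading: raw data is preserved untouched, so bad records remain inspectable and transformations can be re-run as rules evolve.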

What We're Looking For

  • 6+ years of experience in a data engineering or software engineering role, focusing on data infrastructure and pipelines.
  • Strong SQL skills.
  • Proficiency with modern data stack tools (e.g., dbt, Airflow, Spark, Kafka, Delta Lake, Snowflake/BigQuery).
  • Solid experience with cloud platforms and infrastructure-as-code tools.
  • Strong programming skills.
  • Deep understanding of relational databases, data warehousing, and data modeling best practices.
  • Passion for data quality, testing, documentation, and building sustainable systems.
  • Familiarity with OLAP cubes and multidimensional databases.
  • Experience building data pipelines and ETL/ELT processes.
  • Solutions-oriented mindset and strong problem-solving skills.
  • Ownership and accountability for the quality and accuracy of insights.

Nice-to-have Skills

  • Experience with BigQuery and Dataform.
  • Familiarity with Google Cloud Platform (GCP) or other cloud providers.
  • Exposure to Domain-Driven Design (DDD).
  • Experience working in a startup or fast-paced environment.
  • Hands-on experience with cloud-based data warehouses (preferably Google BigQuery).

Why Join Kandua?

  • Be part of a fast-growing startup solving real problems in South Africa.
  • Work remotely with a talented team.
  • Opportunity to shape the DevOps culture and practices.
  • Flexible work arrangements.
  • Work with cutting-edge cloud technologies and best practices.