Backend Developer - BI & Data Engineering

Remote Recruitment

South Africa

Remote

ZAR 700 000 - 900 000

Full time

15 days ago

Job summary

A hiring firm is seeking a Senior Backend Developer to join their UK-based tech team from South Africa. This role involves designing ETL pipelines, optimising data models, and developing backend components using Apache Spark and Scala. Ideal candidates will have over 5 years of ETL experience and a strong background in BI tools. Work remotely while contributing to innovative data solutions.

Qualifications

  • 5+ years of experience in ETL implementation.
  • 3+ years of hands-on experience with Apache Spark and Scala.
  • 2+ years working with BI tools like Power BI, Databricks, and Starburst.
  • Strong experience in data model design and working with Data Lakes.
  • Proven expertise in SQL and relational database systems.
  • Familiarity with CI/CD processes and Agile delivery environments.

Responsibilities

  • Design, build, and maintain scalable ETL pipelines using Apache Spark and Scala.
  • Implement and optimise data models within Data Lakes and relational databases.
  • Develop backend components for BI platforms, supporting tools like Power BI.
  • Ensure high-quality, testable code using static analysis tools.
  • Execute data validation and manage structured/unstructured datasets.
  • Collaborate with DevOps on CI/CD integration.

Skills

Apache Spark
Scala
BI tools
SQL
Data engineering
ETL processes

Tools

Power BI
Databricks
Starburst

Job description

Job Overview

Join a dynamic UK-based tech team seeking a highly experienced Senior Backend Developer with deep expertise in data engineering, BI, and backend development. This role offers the opportunity to work on cutting-edge analytics and data platforms, using tools like Apache Spark, Scala, and Databricks. You'll be part of a fast-paced environment that values clean code, efficient data pipelines, and innovative BI solutions that power strategic decisions. As a valued member of the team, you'll collaborate with global colleagues, contribute to critical backend services, and help shape robust data architecture. If you're passionate about big data, ETL processes, and BI tools, this is your chance to make an impact with a UK employer, from the comfort of your home in South Africa.

Key Responsibilities
  • Design, build, and maintain scalable ETL pipelines using Apache Spark and Scala.
  • Implement and optimise data models within Data Lakes and relational databases.
  • Develop backend components for BI platforms, supporting tools like Power BI, Databricks, and Starburst.
  • Ensure high-quality, testable code using static analysis tools (Sonar, Fortify).
  • Execute data validation and manage structured/unstructured datasets in distributed environments.
  • Collaborate with DevOps on CI/CD integration to ensure seamless delivery processes.
  • Contribute to architectural decisions and performance tuning of complex data systems.
Qualifications and Experience
  • Minimum of 5 years’ experience in ETL implementation.
  • At least 3 years of hands‑on experience with Apache Spark and Scala.
  • Minimum of 2 years working with BI tools (e.g., Power BI, Databricks, Starburst).
  • Strong experience in data model design and working with Data Lakes (e.g., Apache Hive, AWS S3).
  • Proven expertise in SQL and relational database systems.
  • Experience in unit testing and code quality tools (e.g., Sonar, Fortify).
  • Familiarity with CI/CD processes and Agile delivery environments.
  • Equipment required: Personal laptop/desktop, reliable high-speed internet connection.