SENIOR DATA ENGINEER – REMOTE

Tych Business Solutions

South Africa

Hybrid

ZAR 800,000 - 1,200,000

Full time

Job summary

A solutions provider in South Africa is seeking a Senior Data Engineer with over 10 years of experience in ETL development. The successful candidate will design and maintain ETL pipelines using Azure Data Factory and Databricks, ensuring data governance and integration across multiple environments. Strong skills in SQL, Python, and data modelling are essential. This role offers the opportunity to mentor junior engineers and collaborate with cross-functional teams.

Qualifications

  • 10+ years' experience in data engineering and ETL development.
  • Strong expertise in Azure Data Factory and Databricks.
  • Solid background in SQL scripting and performance tuning.

Responsibilities

  • Create new and extend existing data models for the Analytics teams.
  • Design, develop, and maintain ETL pipelines.
  • Implement data movement and transformation across systems.

Skills

Azure Data Factory
Databricks
SQL scripting
Python
Data modelling
ETL development

Education

Tertiary qualification

Tools

Pentaho
SQL Server
Azure Synapse Analytics
Git
Azure DevOps

Job description

Responsibilities

  • Create and/or extend existing data models to include the data consumed by the Analytics teams.
  • Apply the relevant business and technical rules in the ETL jobs to correctly move data.
  • Follow the defined SDLC, including testing and alignment with release management in CCBA.
  • Produce design documents that can be reviewed by the Design Authority.
  • Builds must align with the standards defined by Enterprise Architecture.
  • Provide knowledge transfer (KT), Hypercare, and PGLS for work delivered.
  • Design, develop, and maintain ETL pipelines using Azure Data Factory and Databricks.
  • Implement data movement and transformation across cloud, on-premises, and hybrid systems.
  • Ensure seamless data exchange and integration using Azure Synapse Analytics, Azure Data Lake, and SQL Server.
  • Develop and consume RESTful and SOAP APIs for real-time and batch data integration.
  • Work with API gateways and secure authentication methods (OAuth, JWT, API keys, certificates).
  • Apply data validation, cleansing, and enrichment techniques.
  • Execute reconciliation processes to ensure data accuracy and completeness.
  • Adhere to data governance and security compliance standards.
  • Troubleshoot ETL failures and optimize SQL queries and stored procedures.
  • Provide operational support and enhancements for existing data pipelines.
  • Partner with data analysts, business analysts, and stakeholders to understand data needs.
  • Document data workflows, mappings, and ETL processes for maintainability.
  • Share best practices and mentor junior engineers.

Experience

  • Matric and a tertiary qualification.
  • Experience in large-scale enterprise data integration projects.
  • 10+ years in data engineering, ETL development, and SQL scripting.
  • Strong expertise in Azure Data Factory, Databricks, Synapse, and Pentaho.
  • Proficiency in SQL, Python, PySpark, and performance tuning.
  • Experience with Git, Azure DevOps, and CI/CD pipelines.
  • Solid understanding of data modelling, warehousing, and governance.