Senior Data Engineer

Sabenza IT & Recruitment

Cape Town

On-site

ZAR 60,000 - 85,000

Full time

2 days ago

Job summary

A leading company is seeking a Senior Data Engineer to spearhead the design and optimization of data pipelines and cloud data platforms. The role requires expertise in ETL/ELT processes using tools such as Matillion and Snowflake, collaboration on data workflows involving machine learning, and strong data governance practices. Ideal candidates will have extensive experience and a solid technical background in data architecture.

Qualifications

  • 8+ years of experience in data engineering, data warehousing, and integration.
  • Advanced expertise with Matillion ETL and Snowflake.
  • Strong programming skills in Python, Java, or Scala.

Responsibilities

  • Design and manage ETL/ELT pipelines using Matillion and Snowflake.
  • Build scalable RESTful APIs for data integration.
  • Collaborate on data workflows with Data Scientists and ML Engineers.

Skills

Data modeling
Data governance
API development
ETL/ELT processes
Machine learning workflows
Python
Java
Scala
SQL
DevOps methodologies

Education

Bachelor's or Master's degree in Computer Science, Engineering, or related field

Tools

Matillion
Snowflake
Databricks
AWS
Azure
GCP

Job description

We are looking for a highly experienced Senior Data Engineer to lead the design, development, and optimization of scalable data pipelines, APIs, and cloud data platforms. This pivotal role will focus on ETL/ELT processes using Matillion and Snowflake, as well as integration with Databricks and machine learning workflows. The ideal candidate is a data engineering expert with deep knowledge of modern data architecture, cloud platforms, and API development.

Key Responsibilities:

  • Design, develop, and manage ETL/ELT pipelines using Matillion and Snowflake.
  • Build and maintain scalable, secure RESTful APIs for data integration.
  • Collaborate with Data Scientists and ML Engineers to integrate data workflows with ML pipelines on Databricks.
  • Optimize Snowflake data warehouse performance and maintain data models.
  • Apply best practices for data quality, governance, and security.
  • Automate data validation and reconciliation processes.
  • Document architecture, processes, and technical designs.

Minimum Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 8+ years of experience in data engineering, data warehousing, and data integration.

Required Skills and Experience:

  • Advanced expertise with Matillion ETL and Snowflake.
  • Proficiency in Databricks and machine learning workflow integration.
  • Strong programming skills in Python, Java, or Scala.
  • Experience with API development frameworks and data platform integration.
  • Deep understanding of data modeling, warehousing, and ELT best practices.
  • Proficiency with SQL, CI/CD pipelines, Git, and DevOps methodologies.
  • Familiarity with cloud environments such as AWS, Azure, or GCP.
  • Strong understanding of data governance, security, and compliance frameworks.

Hit apply today for more information!
