Snowflake Tech Lead

Sabenza IT & Recruitment

Remote

ZAR 600 000 - 800 000

Full time

Job summary

A leading IT recruitment company is seeking a skilled Snowflake Tech Lead for a remote role focused on Snowflake development. The ideal candidate will have a degree in IT, strong SQL and Python skills, experience with ETL/ELT tools, and a solid understanding of data warehousing concepts. The role includes performance tuning, data quality assurance, and collaboration with BI teams. Excellent problem-solving and communication skills are essential for success in this role.

Qualifications

  • Strong SQL skills, including complex queries and performance tuning.
  • Proficient in Python for data processing.
  • Solid understanding of data warehousing concepts like Kimball and Data Vault.
  • Familiarity with cloud platforms, preferably Azure.
  • Knowledge of data governance and compliance with GDPR.
  • Excellent problem-solving and communication skills.

Responsibilities

  • Build and optimise Snowflake objects including databases and tables.
  • Develop and maintain robust ETL/ELT data pipelines.
  • Implement dimensional models and efficient structures for analytics.
  • Optimise queries and manage warehouse sizing.
  • Ensure data quality and accuracy through testing frameworks.
  • Apply security measures and comply with data governance standards.
  • Collaborate with BI teams for semantic alignment.
  • Maintain clear documentation for models and processes.

Skills

Snowpark
UDFs
dynamic tables
external tables
streaming/CDC
Kafka
Fivetran
Debezium
Power BI
Tableau
Looker

Education

Degree in IT
Matric

Tools

dbt
Airflow
Azure Data Factory
Informatica
Matillion

Job description

This is a remote position.

Responsibilities
  • Snowflake Development: Build and optimise Snowflake objects (databases, schemas, tables, views, tasks, streams, resource monitors); a brief sketch follows this list.
  • ETL/ELT Pipelines: Develop and maintain robust data pipelines using tools like dbt, Airflow, Azure Data Factory, or similar.
  • Data Modelling: Implement dimensional models (star/snowflake schemas), handle SCDs, and design efficient structures for analytics.
  • Performance Tuning: Optimise queries, manage clustering, caching, and warehouse sizing for cost and speed.
  • Data Quality: Implement testing frameworks (dbt tests, Great Expectations) and ensure data accuracy and freshness.
  • Security & Governance: Apply RBAC, masking policies, and comply with data governance standards.
  • Collaboration: Work with BI teams to ensure semantic alignment and support self-service analytics.
  • Documentation: Maintain clear technical documentation for pipelines, models, and processes.
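As an illustration of the Snowflake development and pipeline work listed above, here is a minimal sketch using snowflake-connector-python. It creates a few of the objects named in the first bullet (database, schema, tables, stream, task) and schedules a simple ELT step; the object names, warehouse, and 15-minute schedule are placeholder assumptions, not details of this role.

```python
import snowflake.connector

# Illustrative connection details only; a real deployment would read these
# from a secrets manager rather than hard-coding them.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    role="SYSADMIN",
    warehouse="COMPUTE_WH",
)

ddl = [
    "CREATE DATABASE IF NOT EXISTS ANALYTICS",
    "CREATE SCHEMA IF NOT EXISTS ANALYTICS.STAGING",
    # Landing table plus a stream that captures change rows for incremental loads.
    """CREATE TABLE IF NOT EXISTS ANALYTICS.STAGING.RAW_ORDERS (
           ORDER_ID NUMBER, CUSTOMER_ID NUMBER, AMOUNT NUMBER(12,2), LOADED_AT TIMESTAMP_NTZ)""",
    """CREATE TABLE IF NOT EXISTS ANALYTICS.STAGING.ORDERS_CLEAN (
           ORDER_ID NUMBER, CUSTOMER_ID NUMBER, AMOUNT NUMBER(12,2))""",
    """CREATE STREAM IF NOT EXISTS ANALYTICS.STAGING.RAW_ORDERS_STREAM
           ON TABLE ANALYTICS.STAGING.RAW_ORDERS""",
    # A scheduled task that drains the stream into the clean table: a minimal ELT step.
    """CREATE TASK IF NOT EXISTS ANALYTICS.STAGING.LOAD_ORDERS
           WAREHOUSE = COMPUTE_WH
           SCHEDULE = '15 MINUTE'
       AS
           INSERT INTO ANALYTICS.STAGING.ORDERS_CLEAN
           SELECT ORDER_ID, CUSTOMER_ID, AMOUNT
           FROM ANALYTICS.STAGING.RAW_ORDERS_STREAM""",
    "ALTER TASK ANALYTICS.STAGING.LOAD_ORDERS RESUME",
]

cur = conn.cursor()
try:
    for statement in ddl:
        cur.execute(statement)
finally:
    cur.close()
    conn.close()
```

In practice most of this DDL would live in version-controlled dbt models or orchestrated migrations rather than an ad hoc script, which is why dbt, Airflow, and Azure Data Factory appear alongside raw Snowflake development in this posting.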
Qualifications
  • Matric and a Degree in IT
  • Strong SQL skills (complex queries, performance tuning) and proficiency in Python for data processing (illustrated in the sketch after this list).
  • Experience with ETL/ELT tools (dbt, Airflow, ADF, Informatica, Matillion).
  • Solid understanding of data warehousing concepts (Kimball, Data Vault, normalization).
  • Familiarity with cloud platforms (Azure preferred; AWS/GCP acceptable).
  • Knowledge of data governance, security, and compliance (GDPR).
  • Excellent problem‑solving and communication skills.
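As a rough illustration of the SQL and Python combination asked for above, the hypothetical snippet below runs a window-function query with QUALIFY to keep the latest record per customer and pulls the result into pandas for further processing; the table and column names are invented for the example.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="MARTS",
)

# ROW_NUMBER + QUALIFY keeps only the most recent row per customer,
# a common deduplication pattern before loading a dimension table.
query = """
    SELECT customer_id,
           email,
           updated_at
    FROM raw_customers
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY customer_id
        ORDER BY updated_at DESC
    ) = 1
"""

cur = conn.cursor()
try:
    cur.execute(query)
    latest_customers = cur.fetch_pandas_all()  # needs the connector's pandas extra installed
    print(latest_customers.head())
finally:
    cur.close()
    conn.close()
```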
Skills
  • Experience with Snowpark, UDFs, dynamic tables, and external tables (see the Snowpark sketch after this list).
  • Exposure to streaming/CDC (Kafka, Fivetran, Debezium).
  • BI tool integration (Power BI, Tableau, Looker).
  • Certifications: SnowPro Core or Advanced.
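To make the Snowpark and UDF items above concrete, here is a minimal sketch using the Snowpark for Python API: it registers a small Python UDF and applies it to a table before writing the result back. The connection parameters, table names, and the UDF itself are assumptions for illustration, not requirements of the role.

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, udf

# Hypothetical connection parameters; real values would come from configuration.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "COMPUTE_WH",
    "database": "ANALYTICS",
    "schema": "STAGING",
}
session = Session.builder.configs(connection_parameters).create()

# Register a simple UDF that normalises free-text country codes;
# input/return types are inferred from the Python type hints.
@udf(name="normalise_country", replace=True)
def normalise_country(code: str) -> str:
    return (code or "").strip().upper()

# Transformations stay lazy; the work is pushed down to Snowflake on write.
customers = session.table("RAW_CUSTOMERS")
cleaned = customers.with_column("COUNTRY_CODE", normalise_country(col("COUNTRY_CODE")))
cleaned.write.save_as_table("CUSTOMERS_CLEAN", mode="overwrite")

session.close()
```

Because Snowpark DataFrame operations compile down to SQL, the same pattern extends naturally to the dynamic tables and CDC-fed sources mentioned in this section.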