
Data Engineer – DBT / Snowflake

TEKISHUB CONSULTING SERVICES PTE. LTD.

Singapore

On-site

SGD 60,000 - 80,000

Full time

Today

Job summary

A leading consulting firm based in Singapore is seeking a skilled Data Engineer to design, develop, and maintain data transformation pipelines. The ideal candidate will have expertise in DBT, Snowflake, and PL/SQL, with a minimum of 3 years in data engineering. Responsibilities include optimizing data models, collaborating with teams, and ensuring data quality. This position offers a dynamic work environment and opportunities for professional development.

Qualifications

  • Minimum 3 years’ experience in data engineering or related fields.
  • Hands-on expertise with DBT (modular SQL development, testing, documentation).
  • Proficiency in Snowflake (data warehousing, performance tuning, security).
  • Strong knowledge of PL/SQL, including stored procedures and functions.
  • Solid understanding of data modeling (star/snowflake schemas, normalization).
  • Experience with version control (Git) and CI/CD practices.
  • Familiarity with Airflow, dbt Cloud, or Prefect is an advantage.
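To illustrate the kind of modular DBT development described above, here is a minimal sketch of a dbt model; the table and column names (`stg_orders`, `order_total`, etc.) are hypothetical, not part of this role's actual codebase.

```sql
-- models/marts/fct_daily_orders.sql
-- Hypothetical dbt model: aggregates a staging table into a daily fact.
-- {{ ref() }} resolves the dependency so dbt builds stg_orders first
-- and records the lineage in the project's DAG.
select
    order_date,
    customer_id,
    count(*)         as order_count,
    sum(order_total) as total_revenue
from {{ ref('stg_orders') }}
group by order_date, customer_id
```

In a real project, this model would typically be paired with schema tests (e.g. `not_null`, `unique`) and documentation in a `schema.yml` file, covering the DBT testing and documentation skills listed above.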

Responsibilities

  • Design and implement scalable data models and transformation pipelines using DBT on Snowflake.
  • Write efficient PL/SQL code for complex data processing and transformation tasks.
  • Collaborate with data analysts, data scientists, and business stakeholders to translate requirements into robust data solutions.
  • Optimize Snowflake performance through query tuning, clustering, and resource management.
  • Ensure data quality, integrity, and governance through testing, documentation, and monitoring.
  • Participate in code reviews and architecture discussions to improve data engineering best practices.
  • Maintain and enhance CI/CD pipelines for DBT projects.
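As a concrete example of the Snowflake performance work listed above, clustering a large table on its most common filter column lets Snowflake prune micro-partitions at query time; the table and column names here are purely illustrative.

```sql
-- Hypothetical example: define a clustering key on a large fact table
-- so queries filtering on order_date scan fewer micro-partitions.
alter table analytics.fct_daily_orders
    cluster by (order_date);

-- Check how well the table is currently clustered on that key.
select system$clustering_information(
    'analytics.fct_daily_orders',
    '(order_date)'
);
```

Clustering keys carry an ongoing maintenance cost in Snowflake, so this technique is normally reserved for large, frequently filtered tables rather than applied by default.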

Skills

  • DBT (Data Build Tool)
  • Snowflake
  • PL/SQL
  • Data modeling
  • Git
  • CI/CD practices
  • Airflow

Job description

Role Overview

We are seeking a skilled Data Engineer with strong expertise in DBT (Data Build Tool), Snowflake, and PL/SQL. The selected candidate will design, develop, and maintain data transformation pipelines supporting business intelligence, analytics, and data-science initiatives across the enterprise.
