Snowflake Data Engineer

BLUE OCEAN SYSTEMS INFOTECH PTE. LTD.

Singapore

On-site

SGD 80,000 - 120,000

Full time

4 days ago

Job summary

BLUE OCEAN SYSTEMS INFOTECH PTE. LTD. is hiring a Snowflake Data Engineer in Singapore to design and optimize data pipelines. The ideal candidate has 5+ years of data engineering experience, proficiency in Snowflake and SQL, and familiarity with cloud platforms. Responsibilities include implementing ETL/ELT workflows and collaborating with analysts and stakeholders.


Job description

Immediate hiring.

Job Overview:

We are seeking a highly skilled Snowflake Data Engineer to join our data engineering team. The ideal candidate will have hands-on experience with Snowflake, strong SQL skills, and a solid understanding of cloud-based data warehousing, ETL processes, and data modeling. The role will focus on designing, developing, and optimizing scalable data pipelines using Snowflake and other modern data stack tools.

Key Responsibilities:

  • Design and implement scalable and efficient data pipelines using Snowflake.
  • Develop and maintain ETL/ELT workflows to ingest, transform, and deliver clean data.
  • Optimize Snowflake performance (query tuning, resource management, clustering).
  • Collaborate with data analysts, data scientists, and business stakeholders to gather requirements and deliver robust data solutions.
  • Implement data governance, security, and access control best practices within Snowflake.
  • Monitor and troubleshoot data pipeline issues and performance bottlenecks.
  • Create and maintain documentation related to data processes and architecture.
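For context, the performance-tuning work described above (query tuning, resource management, clustering) typically involves Snowflake statements along these lines; the warehouse and table names here are hypothetical examples, not part of this role's actual environment:

```sql
-- Right-size the virtual warehouse and let it suspend when idle
ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'MEDIUM' AUTO_SUSPEND = 60;

-- Add a clustering key so large range scans prune micro-partitions
ALTER TABLE sales CLUSTER BY (order_date, region);

-- Inspect how well the table is clustered on a candidate key
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(order_date)');

-- Find the slowest recent queries to target for tuning
SELECT query_id, query_text, total_elapsed_time
FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY())
ORDER BY total_elapsed_time DESC
LIMIT 10;
```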

Required Skills & Qualifications:

  • 5+ years of experience in data engineering or a similar role.
  • 2+ years of hands-on experience with Snowflake (SnowSQL, Streams, Tasks, Cloning, etc.).
  • Proficient in SQL, with deep understanding of analytical and complex queries.
  • Experience with ETL/ELT tools such as dbt, Informatica, Talend, Matillion, or Apache Airflow.
  • Strong knowledge of data modeling concepts (Star/Snowflake schemas).
  • Experience with cloud platforms (AWS, Azure, or GCP) and integrating Snowflake with cloud storage (S3, Blob, GCS).
  • Familiarity with programming languages like Python or Scala for scripting and automation.
  • Strong problem-solving skills and attention to detail.
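The Streams and Tasks features named above are Snowflake's change-data-capture and scheduling primitives; a minimal incremental-load sketch looks like the following (the raw_orders/clean_orders tables and etl_wh warehouse are illustrative assumptions):

```sql
-- Capture row-level changes on the landing table
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

-- Scheduled task that runs only when the stream has new data
CREATE OR REPLACE TASK transform_orders
  WAREHOUSE = etl_wh
  SCHEDULE = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
AS
  INSERT INTO clean_orders (order_id, customer_id, order_date)
  SELECT order_id, customer_id, TRY_TO_DATE(order_date)
  FROM raw_orders_stream
  WHERE METADATA$ACTION = 'INSERT';

-- Tasks are created suspended; resume to start the schedule
ALTER TASK transform_orders RESUME;
```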

Preferred Qualifications:

  • Snowflake certification (SnowPro Core or SnowPro Advanced: Architect) is a plus.
  • Experience with CI/CD, version control (Git), and DevOps practices.
  • Familiarity with BI tools like Power BI, Tableau, or Looker.

Regards

Kshama

+91 9833964181

kshama.raj@blueocean.systems
