
Data Engineer – Snowflake

Second Talent

Daerah Khusus Ibukota Jakarta

On-site

IDR 200.000.000 - 300.000.000

Full time


Job summary

A data-focused tech company in Jakarta is seeking an experienced Data Engineer to design and optimize scalable data pipelines using Snowflake and cloud services. The ideal candidate has 3–5 years of data engineering experience, strong hands-on skills in Snowflake and cloud platforms, and expertise in SQL and Python. The role offers the opportunity to contribute to a collaborative culture focused on data-driven decision-making.

Qualifications

  • 3–5 years of experience as a Data Engineer or Data Warehouse Engineer.
  • Strong hands-on experience with Snowflake.
  • Proficient in SQL and one scripting language, preferably Python.

Responsibilities

  • Design, build, and maintain ETL/ELT pipelines into Snowflake.
  • Collaborate with analysts and teams to deliver reliable datasets.
  • Optimize query performance and warehouse sizing.

Skills

Snowflake (Snowpipe, virtual warehouses)
AWS
SQL
Python
ETL/ELT
Data Modeling
Data Warehousing

Tools

AWS Glue
Azure Data Factory
GCP Cloud Composer

Job description

Position: Data Engineer – Snowflake & Cloud

Location: Jakarta Selatan, Indonesia (Equity Tower, SCBD)

Contract: 8–12 Months

Notice Period: Immediate / 2 weeks

About Our Client:

Our client is a fast-growing, innovative company in the data and analytics space, focused on building scalable cloud-based data platforms to empower business and analytics teams. They leverage modern Snowflake architecture and cloud infrastructure to drive data-driven decision-making.

Role Overview:

We are looking for an experienced Data Engineer with strong hands‑on expertise in Snowflake and cloud platforms (AWS, Azure, or GCP). You will design, implement, and maintain scalable data pipelines, optimize Snowflake performance, and integrate cloud‑based datasets to support analytics and business teams.

Key Responsibilities:

  • Design, build, and maintain ETL/ELT pipelines into Snowflake from multiple data sources.
  • Develop and manage Snowflake warehouses, schemas, roles, tasks, Snowpipe, stages, and data marts.
  • Integrate Snowflake with cloud data storage (AWS S3, Azure Blob Storage, or GCP Cloud Storage).
  • Use cloud services (AWS Glue, Azure Data Factory, GCP Cloud Composer, Lambda/Functions) to automate and orchestrate data loading.
  • Optimize query performance, warehouse sizing, credit usage, clustering, and partitioning.
  • Design and maintain data models (star and snowflake schema) and data marts.
  • Implement monitoring, logging, data validation, and error‑handling across pipelines.
  • Ensure data governance, security (RBAC/IAM), and documentation best practices.
  • Collaborate with analysts, data scientists, and business teams to deliver clean, reliable datasets.
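The monitoring, validation, and error-handling responsibility above can be sketched in plain Python: split incoming rows into valid and rejected sets before loading, and log each rejection. Column names and validation rules here are hypothetical examples, not part of the client's actual pipeline.

```python
# Minimal sketch of pre-load data validation and error handling for a
# pipeline step. Column names and rules are hypothetical.
import logging

REQUIRED_COLUMNS = {"order_id", "customer_id", "amount"}

def validate_rows(rows):
    """Split rows into (valid, rejected) before loading into the warehouse."""
    valid, rejected = [], []
    for row in rows:
        missing = REQUIRED_COLUMNS - row.keys()
        if missing:
            rejected.append((row, f"missing columns: {sorted(missing)}"))
        elif not isinstance(row["amount"], (int, float)) or row["amount"] < 0:
            rejected.append((row, "amount must be a non-negative number"))
        else:
            valid.append(row)
    # Log every rejection so pipeline monitoring can surface bad records.
    for row, reason in rejected:
        logging.warning("rejected row %s: %s", row, reason)
    return valid, rejected
```

In a real pipeline, rejected rows would typically land in a quarantine table or dead-letter stage rather than being dropped.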

Must‑Have Requirements:

  • 3–5 years of experience as a Data Engineer or Data Warehouse Engineer.
  • Strong hands‑on experience with Snowflake (Snowpipe, virtual warehouses, stages, roles, performance tuning).
  • Experience with at least one cloud platform — AWS, Azure, or GCP.
  • Proficient in SQL and one scripting language (Python preferred).
  • Strong understanding of data warehousing, ETL/ELT processes, and data modeling.
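The SQL and ETL/ELT skills listed above often come together in incremental upserts: a MERGE that folds staged rows into a target table. Below is a small Python sketch that generates such a statement; the table and column names are hypothetical, and a production version would use parameter binding and identifier quoting.

```python
# Sketch: build a Snowflake-style MERGE statement for an incremental
# upsert from a staging table into a target table. Names are hypothetical.

def build_merge_sql(target, staging, key, columns):
    """Generate a MERGE that updates matched rows and inserts new ones."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    col_list = ", ".join([key] + columns)
    val_list = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE INTO {target} t USING {staging} s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({val_list})"
    )
```

A statement like this is what tools such as dbt incremental models emit under the hood.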

Good to Have:

  • Experience with MSSQL, PostgreSQL, or other RDBMS.
  • Familiarity with Airflow, dbt, SSIS, Azure Data Factory, AWS Glue, or similar orchestration tools.
  • Knowledge of Terraform / CloudFormation for automated deployment.
  • Experience with BI tools such as Power BI, Tableau, or Looker.
  • Basic understanding of data governance, masking, encryption, or GDPR compliance.

Soft Skills:

  • Strong analytical and problem‑solving skills.
  • Detail‑oriented and documentation-focused.
  • Able to communicate clearly with both technical and non‑technical teams.
  • Independent, proactive, and collaborative.

Why Join:

Work on modern Snowflake and cloud data infrastructure, with the opportunity to lead data platform improvements and migrations, in a collaborative culture with high ownership and room to innovate.

Must‑Have Skills:

Snowflake (Snowpipe, virtual warehouses, stages, roles, performance tuning), AWS/Azure/GCP, SQL, Python, ETL/ELT, Data Modeling, Data Warehousing
