ETL Engineer

FLINTEX CONSULTING PTE. LTD.

Singapore

On-site

SGD 60,000 - 90,000

Full time

16 days ago

Job summary

A leading company is seeking a Data Engineer in Singapore to manage extract/transform/load (ETL) processes. Responsibilities include designing efficient ETL pipelines in Azure Data Factory, ensuring data accuracy, and participating in data modeling. The ideal candidate has 3-5 years of experience in data engineering, proficiency in analytics languages, and familiarity with big data technologies such as Snowflake and Hadoop.

Qualifications

  • 3-5 years of experience in data engineering.
  • Understanding of the Retail, Supply Chain, or Manufacturing domains.
  • Proficiency in data analytics languages and big data technologies.

Responsibilities

  • Design and manage ETL processes, ensuring data flow and integrity.
  • Optimize ETL pipelines for efficiency and monitor performance.
  • Collaborate with departments for data requirements and documentation.

Skills

Data Quality Principles
Data Governance Best Practices
Python
Java
Scala
Hadoop
Spark
Distributed Computing
Version Control Systems

Certifications

SnowPro Core Certification
SnowPro Advanced Certification

Tools

Git

Job description

Objectives of this position:

The objective of this position is to manage the extract/transform/load (ETL) processes, ensuring data availability.

Responsibilities:

The holder of the position is mainly responsible for the following areas, in coordination with his or her superior:

• Design, create, and modify extract/transform/load (ETL) pipelines in Azure Data Factory, ensuring efficient data flow from source to destination.

• Ensure data accuracy and data integrity throughout the ETL processes via data validation, cleansing, deduplication, and error handling, so that reliable and usable data is ingested.

• Monitor the ETL processes and optimize ETL pipelines for speed and efficiency, addressing bottlenecks and ensuring the ETL system can handle the volume, velocity, and variety of data.

• Participate in data modeling, designing the data structures and schema in the data warehouse to optimize query performance and align with business needs.

• Work closely with different departments and the IT team to understand data requirements and deliver the data infrastructure that supports business goals.

• Provide technical support for ETL systems, troubleshooting issues and ensuring the continuous availability and reliability of data flows.

• Ensure proper documentation of data sources, ETL processes, and data architecture.

Requirements:

• 3 to 5 years of experience in data engineering with Snowflake.

• 3 to 5 years in the upstream/downstream Retail industry and/or the Supply Chain/Manufacturing domain.

• Sound understanding of data quality principles and data governance best practices.

• Proficiency in data analytics languages such as Python, Java, Scala, etc.

• Knowledge of big data technologies such as Hadoop and Spark, and of distributed computing frameworks for managing large-scale data processing.

• Proficient in using version control systems such as Git for managing code and configurations.

• SnowPro Core Certification and SnowPro Advanced Certification will be an advantage.
