Data Engineer

CLOUD KINETICS CONSULTING PTE. LTD.

Singapore

On-site

SGD 70,000 - 90,000

Full time

Today

Job summary

A leading technology consulting firm in Singapore is seeking an experienced Data Engineer to develop and maintain data pipelines, ensuring the reliability and scalability of data processing. The ideal candidate will have over 3 years of hands-on experience with AWS and Azure, with strong proficiency in Python and SQL. Responsibilities include developing ETL frameworks and deploying pipelines for both batch and real-time processing.

Qualifications

  • 3+ years of hands-on data engineering experience in AWS.
  • Experience delivering at least 2 programs into production as a data engineer.
  • Understanding of data marts for reporting.

Responsibilities

  • Develop robust ETL pipelines and frameworks for batch and real-time data processing.
  • Deploy and monitor ETL pipelines using orchestration tools.
  • Work with cloud-based data platforms for effective data processing.

Skills

Python
SQL
Data Warehousing
AWS Services
Azure Data Factory

Education

Bachelor’s or Master’s degree in Computer Science

Tools

Airflow
Snowflake
Informatica
Talend
Fivetran

Job description
Overview

We are seeking an experienced Data Engineer to join our Data team in Singapore. The ideal candidate will have a proven track record of hands-on data engineering experience, particularly within AWS and Azure. As a Data Engineer, you will be responsible for developing and maintaining data pipelines, ensuring the reliability, efficiency, and scalability of our data lake, and enabling data marts for AI models.

Responsibilities
  • Develop robust ETL pipelines and frameworks for both batch and real-time data processing using Python and SQL.
  • Deploy and monitor ETL pipelines using orchestration tools such as Airflow and dbt, or AWS services such as Glue Workflows, Step Functions, and EventBridge.
  • Work with cloud-based data platforms such as Redshift and Snowflake, data ingestion tools such as DMS, and ELT tools such as dbt Cloud for effective data processing.
  • Work with Azure Data Factory to build data pipelines.
  • Implement CI/CD for ETL pipelines to automate builds and deployments.
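The batch responsibilities above boil down to the classic extract-transform-load pattern. A minimal, hedged sketch of that pattern in Python follows; the record shape, table name, and use of in-memory SQLite (standing in for a warehouse such as Redshift or Snowflake) are illustrative assumptions, not part of this role's actual stack.

```python
import sqlite3

def extract(raw_rows):
    """Extract step: in a real pipeline this would read from S3 or a DMS feed;
    here it takes an in-memory list of (id, amount) records and drops nulls."""
    return [r for r in raw_rows if r is not None]

def transform(rows):
    """Transform step: discard malformed records and normalise amounts to cents."""
    out = []
    for rec_id, amount in rows:
        if amount is None:
            continue  # skip records with a missing amount
        out.append((rec_id, int(round(float(amount) * 100))))
    return out

def load(rows, conn):
    """Load step: upsert into a staging table (SQLite stands in for the warehouse).
    Returns the resulting row count so the run can be monitored."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (id INTEGER PRIMARY KEY, cents INTEGER)"
    )
    conn.executemany("INSERT OR REPLACE INTO payments VALUES (?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]

def run_pipeline(raw_rows):
    """Chain the three steps; an orchestrator like Airflow would schedule each
    step as its own task rather than calling them in one function."""
    conn = sqlite3.connect(":memory:")
    return load(transform(extract(raw_rows)), conn)
```

In production each function would typically become a separate orchestrated task (an Airflow operator or a Step Functions state) so failures can be retried per step rather than per run.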
Qualifications
  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
  • 3+ years of hands-on data engineering experience in AWS.
  • Delivered at least 2 programs into production as a data engineer.
Primary Skills
  • Proficient in Python, SQL, and data warehousing concepts.
  • Experience developing ETL frameworks.
  • Proficient in AWS services such as S3, DMS, Redshift, Glue, Kinesis, Athena, Lambda, and Step Functions to implement scalable data solutions.
  • Proficient in Azure Data Factory.
  • Working experience in data warehousing using Snowflake, AWS, or Databricks.
  • Understanding of data marts as the presentation layer for reporting.
Good-to-Have Skills
  • ETL development using tools such as Informatica, Talend, or Fivetran.
  • CI/CD setup using GitHub or Bitbucket.
  • Good communication skills.
  • Good knowledge of data lake and data warehousing concepts.