Data Engineer-SD

TOSS-EX PR PTE. LTD.

Singapore

On-site

SGD 60,000 - 80,000

Full time

3 days ago

Job summary

A tech company based in Singapore is looking for a skilled data engineer. The role involves building and maintaining data pipelines, ensuring data quality, and collaborating with teams to enhance scalability and automation. Candidates should possess a degree in Computer Science or a related field, along with strong skills in Python or Java. Experience with data modeling, SQL, and tools like Databricks and Azure Data Factory is highly valued. This position offers an opportunity to work in a fast-paced environment.

Qualifications

  • Hands-on experience with Python or Java and understanding of OOP principles.
  • Proficient in SQL for querying and transforming structured data.
  • Experience with enterprise data warehouses or data lakes.
  • Familiarity with data pipelines and transformation workflows.

Responsibilities

  • Build, test, and maintain data pipelines and transformation workflows.
  • Support data integration and processing for cross-functional projects.
  • Perform data quality validation and troubleshooting.
  • Assist with datasets and documentation for analytical needs.
  • Collaborate to improve platform scalability and automation.

Skills

Python
SQL
Java
Data modeling
ETL best practices
Collaborative teamwork

Education

Bachelor’s or Master’s degree in Computer Science or Information Technology

Tools

Databricks
Azure Data Factory
Apache Spark
Hive
Delta Lake

Job description

Responsibilities

  • Build, test, and maintain data pipelines and transformation workflows on CapitaLand’s Enterprise Data Platform (EDP)
  • Support data integration and processing tasks for cross-functional business projects
  • Perform data quality validation and troubleshooting to ensure data reliability and consistency
  • Assist in the preparation of datasets and documentation to support analytical and operational needs
  • Collaborate with team members to improve platform scalability, automation, and best practices

Must-have Requirements

  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field
  • Hands-on experience with Python or Java, with a solid understanding of object-oriented programming (OOP) principles
  • Proficient in SQL for querying and transforming structured data
  • Strong knowledge of data modeling and ETL best practices
  • Experience working with enterprise data warehouses or data lakes
  • Familiarity with data pipelines and batch/stream data processing workflows
  • Strong communication skills with the ability to work independently and collaboratively in a fast-paced environment

Good-to-have Requirements

  • Exposure to cloud platforms such as Microsoft Azure
  • Experience with tools like Databricks, Azure Data Factory, Apache Spark, Hive, or Delta Lake