Data Engineer
MINDTECK SINGAPORE PTE LTD
Singapore · On-site · Full time
SGD 60,000 - 90,000

Job summary

A leading data solutions company in Singapore is seeking a skilled Data Engineer to develop, maintain, and optimize data systems. The ideal candidate will possess strong programming skills in Python and SQL, with experience in cloud platforms such as AWS, Azure, or GCP. A Polytechnic diploma or bachelor’s degree in computer science or a related field is required. This role offers the opportunity to work with diverse data technologies and collaborate with various stakeholders.

Qualifications

  • Strong Python and SQL skills are essential.
  • Experience with major cloud providers like AWS, Azure, and GCP.
  • Knowledge of data warehouses and ETL/ELT processes.

Responsibilities

  • Develop and maintain scalable data systems and architectures.
  • Collect, clean, and transform raw data into usable formats.
  • Collaborate with stakeholders to understand data requirements.

Skills

Python
SQL
Programming
Analytical skills
Problem-solving abilities

Education

Polytechnic Diploma or Bachelor's degree in a relevant field

Tools

PostgreSQL
MySQL
MongoDB
AWS
Azure
GCP
Apache Spark
Apache Kafka
Hadoop
Apache Airflow

Job description

Job Title: Data Engineer

Key Responsibilities
  • Develop, construct, test, and maintain scalable data systems, pipelines, and architectures (data lakes, warehouses).
  • Collect raw data from various sources, clean it, and transform it into usable formats (ETL/ELT); a minimal sketch follows this list.
  • Work with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions.
  • Improve existing data frameworks, monitor workflows, and troubleshoot issues.
  • Implement data governance, validation, and security measures.
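As a rough illustration of the ETL/ELT responsibility above, here is a minimal sketch in Python using pandas and SQLAlchemy. The file path, connection string, and table name are hypothetical placeholders, not details taken from this job description, and a PostgreSQL driver such as psycopg2 is assumed to be installed.

    # Minimal ETL sketch: extract a CSV, clean and transform it with pandas,
    # then load it into a PostgreSQL table.
    import pandas as pd
    from sqlalchemy import create_engine

    def run_etl(csv_path: str, db_url: str, table: str) -> None:
        # Extract: read raw data from a source file
        raw = pd.read_csv(csv_path)

        # Transform: drop duplicates, normalise column names, fill missing values
        df = raw.drop_duplicates()
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        df = df.fillna(0)

        # Load: append the cleaned data to the target table
        engine = create_engine(db_url)
        df.to_sql(table, engine, if_exists="append", index=False)

    if __name__ == "__main__":
        # Hypothetical source file, database URL, and target table
        run_etl("sales_raw.csv", "postgresql://user:pass@localhost:5432/dw", "sales_clean")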

Qualifications & Skills

Education

Polytechnic Diploma or Bachelor's degree in Computer Science, Data Analytics, Business Intelligence, or any related field (or equivalent experience).

Experience
  • Programming: Strong Python and SQL skills are fundamental, with Java/Scala useful for big data.
  • Databases: Experience with relational (PostgreSQL, MySQL) and NoSQL databases (MongoDB).
  • Cloud Platforms: Proficiency in major cloud providers (AWS, Azure, GCP) for data services (S3, Redshift, EMR, etc.).
  • Big Data: Working knowledge of tools like Apache Spark, Kafka, and Hadoop.
  • ETL/ELT: Designing and implementing data extraction, transformation, and loading processes.
  • Data Warehousing: Concepts and tools for storing large datasets (e.g., Redshift, Snowflake).
  • Orchestration: Tools like Apache Airflow for scheduling and managing workflows (see the sketch after this list).
  • Data Modeling: Designing efficient database and data warehouse schemas.
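For the orchestration point, below is a minimal Apache Airflow sketch of a daily pipeline, assuming Airflow 2.4 or later (older versions use schedule_interval instead of schedule). The DAG id and the task callables are hypothetical placeholders.

    # Minimal Airflow sketch: a daily DAG that chains extract, transform,
    # and load steps in order.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw data from source systems")

    def transform():
        print("clean and reshape the extracted data")

    def load():
        print("write the transformed data to the warehouse")

    with DAG(
        dag_id="daily_sales_pipeline",   # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        # Run extract -> transform -> load sequentially
        t_extract >> t_transform >> t_load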

Technical Skills
  • Programming: Python, Java, SQL (essential).
  • Databases: Relational (SQL) & NoSQL databases.
  • Big Data Tools: Hadoop, Spark (often required).
  • Cloud Platforms: AWS, Azure, GCP (increasingly common).
  • ETL/ELT Tools: Expertise in data integration tools.
  • Software Engineering: Strong understanding of data structures and algorithms.

Soft Skills
  • Strong analytical and problem-solving abilities.
  • Good communication skills to collaborate with non-technical teams.
  • Eagerness to learn and adapt in a fast-paced environment.
  • Openness to learning new software or BI tools in the market.

Nice to Have
  • Experience with BI or reporting tools (publishing, scheduling, permissions).
  • Knowledge of GitHub and GitLab.
  • Knowledge of SDLC and DDLC.
  • Understanding of Agile/Scrum methodologies.