Data Engineer

KNOWLEDGESG GLOBAL PTE. LTD.

Singapore

On-site

SGD 80,000–110,000

Full time

24 days ago

Job summary

A leading data solutions company in Singapore is looking for a Data Engineer to design and maintain ETL pipelines and optimize data architectures. The ideal candidate has 5–7 years of experience, strong SQL skills, and expertise in cloud data services. If you hold a Bachelor's or Master's degree in Computer Science or IT and are keen to work in a dynamic team, this role offers a great opportunity in data engineering.

Qualifications

  • 5–7 years of hands-on experience in Data Engineering or a related role.
  • Strong proficiency in SQL and experience with both relational and non-relational databases.
  • Expertise in data pipeline tools such as Apache Airflow, Kafka, Spark.

Responsibilities

  • Design, develop, and maintain ETL pipelines and data architectures.
  • Integrate data from multiple sources into centralized data warehouses or data lakes.
  • Collaborate with cross-functional teams to deliver data solutions.

Skills

SQL
Data pipeline tools (Apache Airflow, Kafka, etc.)
Programming skills (Python, Java, Scala)
Data modeling
Big data frameworks (Hadoop, Spark, Hive)
Analytical skills

Education

Bachelor’s or Master’s Degree in Computer Science or IT

Tools

PostgreSQL
MySQL
MongoDB
Cassandra
AWS
Azure
GCP

Job description

Key Responsibilities:
  • Design, develop, and maintain scalable ETL (Extract, Transform, Load) pipelines and data architectures.
  • Integrate data from multiple sources into centralized data warehouses or data lakes.
  • Optimize data storage, transformation, and retrieval processes for performance and scalability.
  • Collaborate with cross-functional teams (Data Science, Analytics, BI, and Engineering) to deliver data solutions.
  • Implement data governance, quality, and security best practices.
  • Automate data workflows and monitor data pipeline performance and reliability.
  • Develop and maintain data models, schemas, and metadata documentation.
  • Work with cloud data services (AWS, Azure, or GCP) to manage and deploy data solutions.
  • Troubleshoot data issues, perform root cause analysis, and ensure high data integrity.
  • Continuously evaluate new technologies to improve data engineering processes.

Required Skills & Qualifications:
  • Bachelor’s or Master’s Degree in Computer Science, Information Technology, or related field.
  • 5–7 years of hands-on experience in Data Engineering or a related role.
  • Strong proficiency in SQL and experience with relational and non-relational databases (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
  • Expertise in data pipeline tools such as Apache Airflow, Kafka, Spark, NiFi, or Talend.
  • Strong programming skills in Python, Java, or Scala for data manipulation and automation.
  • Experience with big data frameworks (Hadoop, Spark, Hive, HBase).
  • Proven experience in cloud platforms (AWS Glue, Redshift, S3, Azure Data Factory, GCP BigQuery, etc.).
  • Hands-on experience with data modeling, ETL design, and data warehouse architecture.
  • Familiarity with DevOps, CI/CD pipelines, and containerization (Docker, Kubernetes) is a plus.
  • Strong analytical and problem-solving skills with attention to detail.