Data Engineer

HORIZON GLOBAL SERVICES PTE. LTD.

Singapore

On-site

SGD 70,000 - 100,000

Full time

2 days ago

Job summary

A leading global services company in Singapore is seeking a skilled Data Engineer to design and implement end-to-end data pipelines. Candidates should have strong experience with cloud platforms and solid proficiency in Python, SQL, and data modeling. The role involves optimizing storage processes, implementing data governance, and collaborating with diverse teams. A Bachelor’s or Master’s degree in Computer Science or a related field is required. Competitive compensation and growth opportunities are offered.

Qualifications

  • Advanced proficiency in Python, SQL, and Scala/Java.
  • Strong experience with Apache Spark, Hadoop, Kafka, and Flink.
  • Expertise in ETL/ELT frameworks and data modeling techniques.
  • Hands-on experience with cloud platforms (AWS, Azure, or GCP).
  • Experience with Snowflake, BigQuery, Redshift, or Databricks.
  • Deep understanding of distributed systems and parallel computing.
  • Strong knowledge of CI/CD pipelines and DevOps practices.
  • Experience with Docker and Kubernetes.
  • Knowledge of data security, encryption, and compliance standards.

Responsibilities

  • Design and implement end-to-end data pipelines for batch and real-time processing.
  • Architect and maintain data warehouses, data lakes, and lakehouse solutions.
  • Optimize data ingestion, transformation, and storage processes for performance and scalability.
  • Manage large-scale structured and unstructured data across distributed systems.
  • Build fault-tolerant, highly available data systems.
  • Implement data governance, security, and access control policies.
  • Collaborate with data scientists, analysts, and product teams to enable advanced analytics and ML workloads.
  • Monitor, debug, and resolve data pipeline failures and latency issues.
  • Automate data workflows using modern orchestration tools.

Skills

Python
SQL
Scala
Java
Apache Spark
Hadoop
Kafka
Flink
ETL/ELT frameworks
data modeling techniques
cloud platforms
Snowflake
BigQuery
Redshift
Databricks
distributed systems
parallel computing
CI/CD pipelines
DevOps practices
Docker
Kubernetes
data security
encryption standards
compliance standards

Education

Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
Preferred Skills

  • Experience with machine learning data pipelines.
  • Knowledge of GraphQL and RESTful APIs.
  • Exposure to data lineage and metadata management tools.
  • Experience with real-time streaming architectures.
