Data Engineer: Real-Time Data Pipelines & Analytics

Thales

Singapore

Hybrid

SGD 70,000 - 90,000

Full time

11 days ago

Job summary

A global technology leader located in Singapore seeks skilled Data Engineers to design and optimize data pipelines for next-generation Data Warehouse and Data Lakehouse solutions. Candidates need expertise in ETL/ELT processes using tools like Apache Kafka and Spark, and must ensure data quality and regulatory compliance. This position encourages collaboration in a diverse environment while driving technical excellence. A Bachelor's in Computer Science or a related field is essential, with a Master's preferred.

Benefits

Diverse and inclusive work environment
Opportunities for professional growth and learning

Qualifications

  • Proficient in selecting data processing algorithms that balance latency and throughput (see the configuration sketch after this list).
  • Experienced in implementing ETL/ELT data pipelines for structured and unstructured data.
  • Familiar with cloud-native deployment strategies for cloud providers.
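
The latency/throughput trade-off in the first bullet can be made concrete at the configuration level. The minimal sketch below uses the Apache Kafka Java client; the broker address, topic name, and specific values are placeholder assumptions for illustration, not part of this posting. Larger batches and a longer linger time favour throughput, while smaller values favour per-record latency.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ThroughputTunedProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Placeholder broker address -- an assumption for illustration only.
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            // Throughput-oriented settings: wait up to 50 ms to fill 64 KiB batches
            // and compress them. A latency-sensitive feed would instead use a
            // linger.ms near 0 and smaller batches.
            props.put(ProducerConfig.LINGER_MS_CONFIG, 50);
            props.put(ProducerConfig.BATCH_SIZE_CONFIG, 64 * 1024);
            props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // "events" is a hypothetical topic name.
                producer.send(new ProducerRecord<>("events", "key-1", "{\"id\": 1}"));
            }
        }
    }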

Responsibilities

  • Integrate data from various sources like APIs and databases.
  • Architect and maintain scalable ETL/ELT pipelines for data transformation (illustrated in the sketch after this list).
  • Ensure compliance with cybersecurity and regulatory requirements.
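
As a rough illustration of the pipeline work described above, the sketch below wires together several technologies named in this posting: it reads JSON events from a Kafka topic with Spark Structured Streaming, extracts a few fields, and appends each micro-batch to a PostgreSQL table over JDBC. The topic name, table name, connection details, and checkpoint path are placeholder assumptions, and the job assumes the spark-sql-kafka-0-10 connector and the PostgreSQL JDBC driver are on the classpath.

    import org.apache.spark.api.java.function.VoidFunction2;
    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import org.apache.spark.sql.streaming.StreamingQuery;

    import static org.apache.spark.sql.functions.col;
    import static org.apache.spark.sql.functions.get_json_object;

    public class KafkaToPostgresPipeline {
        public static void main(String[] args) throws Exception {
            SparkSession spark = SparkSession.builder()
                    .appName("kafka-to-postgres-etl")
                    .getOrCreate();

            // Extract: subscribe to a (hypothetical) Kafka topic of JSON events.
            Dataset<Row> raw = spark.readStream()
                    .format("kafka")
                    .option("kafka.bootstrap.servers", "broker:9092")
                    .option("subscribe", "events")
                    .load();

            // Transform: parse the Kafka record value and keep a few fields.
            Dataset<Row> parsed = raw
                    .selectExpr("CAST(value AS STRING) AS json")
                    .select(
                            get_json_object(col("json"), "$.id").alias("id"),
                            get_json_object(col("json"), "$.ts").alias("ts"),
                            get_json_object(col("json"), "$.payload").alias("payload"));

            // Load: append each micro-batch to a PostgreSQL table via JDBC.
            VoidFunction2<Dataset<Row>, Long> writeBatch = (batch, batchId) ->
                    batch.write()
                            .format("jdbc")
                            .option("url", "jdbc:postgresql://db-host:5432/analytics")
                            .option("dbtable", "events_clean")
                            .option("user", "etl_user")
                            .option("password", "etl_password")
                            .mode("append")
                            .save();

            StreamingQuery query = parsed.writeStream()
                    .foreachBatch(writeBatch)
                    .option("checkpointLocation", "/tmp/checkpoints/events")
                    .start();

            query.awaitTermination();
        }
    }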

Skills

Data processing algorithm selection
ETL & ELT data pipelines
Apache Kafka
Apache Spark 3.0
Kubernetes
PostgreSQL
Java 8+
Git-based protocols
Linux command line
OpenTelemetry integration
Good communication skills in English

Education

Bachelor's degree in Computer Science or Information Technology
Master's degree in Computer Science or Data Science (preferred)

Tools

Grafana
Prometheus
Elasticsearch
Kibana
Spark Structured Streaming
Flink DataStream