
Data Engineer

RAPSYS TECHNOLOGIES PTE. LTD.

Singapore

On-site

SGD 85,000 - 110,000

Full time


Job summary

A technology firm in Singapore is seeking a Data Engineer to design and optimize ETL/ELT data pipelines and to build real-time streaming pipelines. The ideal candidate has 6–10 years of data engineering experience, particularly in BFSI (banking, financial services, and insurance), and proficiency in tools such as Apache Spark and Azure. This role offers the chance to work on cutting-edge data projects in a collaborative environment.

Qualifications

  • 6 – 10 years of experience in data engineering, with at least 3 years in BFSI.
  • Proven experience building real-time and batch data pipelines.
  • Familiarity with regulatory data models like MAS 610, Basel III.

Responsibilities

  • Design, implement, and optimize ETL/ELT data pipelines.
  • Build and operationalize real-time streaming pipelines.
  • Integrate and transform data across various systems.

Skills

Data engineering
ETL/ELT
Apache Spark
Azure
Kafka
Data quality
Python
DevOps

Education

Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field

Tools

Apache Spark
Databricks
Azure Synapse
Azure DevOps
GitHub Actions
Terraform

Job description

Key Responsibilities

  • Design, implement, and optimize ETL/ELT data pipelines using Apache Spark, PySpark, Databricks, or Azure Synapse.
  • Build and operationalize real-time streaming pipelines leveraging Kafka / Confluent / Azure Event Hubs for risk and liquidity data.
  • Integrate and transform data across Core Banking, Trade, Payments, Treasury, CRM, and Compliance systems.
  • Implement data quality, validation, and lineage controls using tools such as Great Expectations / Deequ / dbt tests.
  • Develop and maintain data models and schemas (3NF, Dimensional, Data Vault 2.0).
  • Collaborate with Security and Governance teams to implement data security, masking, encryption, and tokenization in compliance with MAS TRM / PDPA / PCI-DSS.
  • Participate in data platform modernization projects (Teradata / DB2 → Snowflake / Databricks / Synapse).
  • Collaborate with Data Scientists and AI Engineers to deploy ML feature stores and model-serving pipelines.
  • Support regulatory reporting (MAS 610/649) and Basel III/IV data flows.
  • Maintain CI/CD pipelines for data infrastructure using Azure DevOps / Terraform / GitHub Actions.

Experience and Qualifications

  • 6 – 10 years of experience in data engineering, with at least 3 years in BFSI (banking, insurance, or capital markets).
  • Proven experience building real-time and batch data pipelines on Azure or AWS.
  • Exposure to regulatory data models (MAS 610, Basel III, IFRS 9/17, BCBS 239).
  • Familiarity with DevOps and MLOps integration.
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • Certifications preferred: Microsoft Azure Data Engineer Associate, Databricks Data Engineer Professional, Snowflake SnowPro Core.

Key Attributes

  • Strong analytical and problem-solving mindset.
  • Ability to work across multi-disciplinary and geographically distributed teams.
  • Excellent written and verbal communication skills.
  • High accountability and ownership for quality and delivery.