
Senior Data Engineer

OPENSOURCE PTE. LTD.

Singapore

On-site

SGD 80,000 - 120,000

Full time


Job summary

A leading data engineering firm in Singapore seeks a Senior Data Engineer to design and maintain large-scale data pipelines for Financial Services. Candidates should have 6-10 years of data engineering experience, including at least three years in BFSI, with proven skills in Azure and data pipeline technologies. The role involves close collaboration across teams, with a strong focus on data quality and security.

Qualifications

  • 6 – 10 years of experience in data engineering, with at least 3 years in BFSI.
  • Proven experience building real-time and batch data pipelines on Azure or AWS.
  • Familiarity with DevOps and MLOps integration.

Responsibilities

  • Design, implement, and optimize ETL/ELT data pipelines.
  • Build and operationalize real-time streaming pipelines.
  • Collaborate with Data Scientists and AI Engineers to deploy ML feature stores.

Skills

Python
PySpark
SQL
Scala
Azure Data Lake
Databricks
Snowflake
Apache Airflow
Kafka
Terraform

Education

Bachelor’s or Master’s degree in Computer Science

Tools

Azure Data Factory
GitHub Actions
Apache Atlas
Azure Purview

Job description

1. Role Overview

The Senior Data Engineer is responsible for designing, building, and maintaining large-scale, secure, and high-performance data pipelines supporting critical Financial Services workloads.

The role focuses on data modernization, regulatory data aggregation, and AI/ML enablement across domains such as Core Banking, Payments, Risk, Treasury, and Regulatory Reporting.

2. Key Responsibilities
  • Design, implement, and optimize ETL/ELT data pipelines using Apache Spark, PySpark, Databricks, or Azure Synapse.
  • Build and operationalize real-time streaming pipelines leveraging Kafka / Confluent / Azure Event Hubs for risk and liquidity data.
  • Integrate and transform data across Core Banking, Trade, Payments, Treasury, CRM, and Compliance systems.
  • Implement data quality, validation, and lineage controls using tools such as Great Expectations / Deequ / dbt tests.
  • Develop and maintain data models and schemas (3NF, Dimensional, Data Vault 2.0).
  • Collaborate with Security and Governance teams to implement data security, masking, encryption, and tokenization in compliance with MAS TRM / PDPA / PCI-DSS.
  • Participate in data platform modernization projects (Teradata / DB2 → Snowflake / Databricks / Synapse).
  • Collaborate with Data Scientists and AI Engineers to deploy ML feature stores and model-serving pipelines.
  • Support regulatory reporting (MAS 610/649) and Basel III/IV data flows.
  • Maintain CI/CD pipelines for data infrastructure using Azure DevOps / Terraform / GitHub Actions.
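In practice, the data-quality and validation controls listed above are implemented with frameworks such as Great Expectations, Deequ, or dbt tests. As a rough illustration of the kind of batch validation logic involved, here is a minimal plain-Python sketch; the record fields and expectation names are invented for the example and do not reflect any framework's actual API:

```python
# Plain-Python sketch of batch data-quality checks in the style of
# Great Expectations / Deequ -- illustrative only, not a real framework.

from typing import Callable

# Hypothetical sample of payment records flowing through a pipeline.
rows = [
    {"txn_id": "T1", "amount": 120.50, "currency": "SGD"},
    {"txn_id": "T2", "amount": 99.00, "currency": "SGD"},
    {"txn_id": "T3", "amount": -5.00, "currency": "USD"},  # bad amount
]

def expect(name: str, check: Callable[[list], bool]) -> dict:
    """Run one expectation over the whole batch and record the outcome."""
    return {"expectation": name, "success": check(rows)}

results = [
    expect("txn_id values are unique",
           lambda rs: len({r["txn_id"] for r in rs}) == len(rs)),
    expect("amount is non-negative",
           lambda rs: all(r["amount"] >= 0 for r in rs)),
    expect("currency is a 3-letter code",
           lambda rs: all(len(r["currency"]) == 3 for r in rs)),
]

# Collect the failed expectations so the pipeline can quarantine the batch.
failed = [r["expectation"] for r in results if not r["success"]]
print(failed)  # the batch fails only the non-negative amount check
```

Production pipelines would attach such checks to each ingestion stage and emit the results to lineage and audit-logging tooling rather than printing them.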
3. Required Technical Skills
  • Languages: Python, PySpark, SQL, Scala
  • Data Platforms: Azure Data Lake, Synapse, Databricks, Snowflake
  • Orchestration: Apache Airflow, Azure Data Factory, dbt
  • Streaming: Kafka, Confluent, Event Hubs
  • Governance: Apache Atlas, Azure Purview, Collibra
  • Security: Encryption, RBAC, Tokenization, Audit Logging
  • CI/CD & IaC: Terraform, Azure DevOps, GitHub Actions
4. Experience and Qualifications
  • 6 – 10 years of experience in data engineering, with at least 3 years in BFSI (banking, insurance, or capital markets).
  • Proven experience building real-time and batch data pipelines on Azure or AWS.
  • Exposure to regulatory data models (MAS 610, Basel III, IFRS 9/17, BCBS 239).
  • Familiarity with DevOps and MLOps integration.
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • Certifications preferred: Microsoft Azure Data Engineer Associate, Databricks Data Engineer Professional, Snowflake SnowPro Core.
5. Key Attributes
  • Strong analytical and problem-solving mindset.
  • Ability to work across multi-disciplinary and geographically distributed teams.
  • Excellent written and verbal communication skills.
  • High accountability and ownership for quality and delivery.