
Senior Data Engineer

RECRUIT EXPRESS PTE LTD

Singapore

On-site

SGD 60,000 - 90,000

Full time

2 days ago

Job summary

A leading recruitment agency in Singapore is seeking a Data Engineer to create and maintain ETL pipelines and backend services. The ideal candidate will be proficient in Python, Go, or Java, with experience in data engineering tools like Apache NiFi and Spark. Responsibilities include building scalable ETL solutions and implementing CI/CD pipelines. To apply, please send your resume to the contact person listed below.

Qualifications

  • Proficient in data engineering tools and frameworks.
  • Experience with backend service development.
  • Strong programming skills in Python, Go, or Java.

Responsibilities

  • Build and maintain ETL pipelines for high-volume data.
  • Design and develop backend services using microservices.
  • Implement CI/CD pipelines for deploying applications.

Skills

Python
Go (Golang)
Java
Ruby
JavaScript/TypeScript

Tools

Apache NiFi
Spark
Airflow
PostgreSQL
MongoDB

Job description

Responsibilities

ETL & Data Engineering:

  • Build and maintain robust ETL pipelines to ingest, transform, and load high-volume, high-velocity data from IT infrastructure, monitoring systems, and cloud environments.
  • Develop batch and real-time data flows using frameworks such as Spark, Flink, or Kafka.
  • Optimize ETL jobs for scalability, fault tolerance, and low latency.
  • Implement data validation, cleansing, and normalization processes for consistent AI model input (an illustrative sketch follows this list).
  • Integrate with AIOps platforms and ML pipelines using REST APIs or event-driven architectures.
  • Develop and maintain robust data pipelines for ingesting, filtering, transforming, and loading data from various sources (e.g., network devices, appliances, databases, APIs, cloud storage).
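
As a rough illustration of the validation, cleansing, and normalization work described above, the following minimal Python sketch coerces raw records into a consistent schema. The record fields (host, cpu_pct, ts) and value ranges are illustrative assumptions, not details from this posting.

# Illustrative only: a minimal validate/cleanse/normalize step.
# Field names ("host", "cpu_pct", "ts") are hypothetical, not from the posting.
from datetime import datetime, timezone
from typing import Iterable, Iterator

def normalize_records(raw: Iterable[dict]) -> Iterator[dict]:
    """Drop malformed records and coerce the rest into a consistent schema."""
    for rec in raw:
        host, cpu, ts = rec.get("host"), rec.get("cpu_pct"), rec.get("ts")
        if not host or cpu is None or ts is None:
            continue  # cleansing: skip incomplete rows
        try:
            yield {
                "host": str(host).strip().lower(),            # normalization
                "cpu_pct": min(max(float(cpu), 0.0), 100.0),  # clamp to a valid range
                "ts": datetime.fromtimestamp(float(ts), tz=timezone.utc).isoformat(),
            }
        except (TypeError, ValueError):
            continue  # validation: skip rows that fail type coercion

if __name__ == "__main__":
    sample = [
        {"host": " Web-01 ", "cpu_pct": "87.5", "ts": 1700000000},
        {"host": "", "cpu_pct": 12.0, "ts": 1700000060},  # dropped: missing host
    ]
    for row in normalize_records(sample):
        print(row)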

Application & Backend Development:

  • Design and build backend services (microservices, APIs) in Python / Go / Java / Ruby to support data ingestion, metadata services, and configuration management.

DevOps & Orchestration:

  • Use tools like Elastic Stack, Apache Airflow, Prefect, or Dagster to schedule and monitor ETL jobs (a minimal scheduling sketch follows this list).
  • Implement CI/CD pipelines for deploying ETL services and full-stack apps.
  • Perform any other ad-hoc duties as assigned by the supervisor.
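
As a sketch of the orchestration work described above, here is a minimal Apache Airflow DAG (assuming Airflow 2.4+ for the schedule parameter); the DAG id, hourly cadence, and placeholder task bodies are illustrative assumptions, not details from this posting.

# Illustrative only: a minimal Airflow DAG that schedules a three-step ETL job.
# Airflow handles retries, logging, and monitoring of each task run.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # pull raw events from a source system (placeholder)

def transform():
    ...  # validate / cleanse / normalize (placeholder)

def load():
    ...  # write to the target store (placeholder)

with DAG(
    dag_id="example_etl_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",             # hypothetical cadence
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load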

Requirements

Programming & Scripting:

  • Proficient in one or more of the following:
      • Python – ETL scripting, backend APIs, ML interfacing
      • Ruby – Scripting and legacy tool integration
      • Go (Golang) – High-performance microservices, concurrency handling
      • Java – Enterprise-grade systems, Spark, Kafka pipelines
      • JavaScript/TypeScript with Next.js – Frontend dashboards and admin UIs

Data Engineering Tools:

  • Proficient in one or more of the following:
      • ETL Frameworks: Apache NiFi, Spark, Airflow, Flink, Kafka, Talend
      • Orchestration: Airflow, Prefect, Dagster
      • Data Storage: PostgreSQL, MongoDB, Elasticsearch, Snowflake, BigQuery
      • Streaming Platforms: Kafka, Kinesis, Pub/Sub
      • Data Lake/File Storage: S3, GCS, Azure Blob

AIOps & Observability (a plus):

  • Experience integrating with tools like Splunk, Dynatrace, AppDynamics, New Relic, Elastic Stack
  • Familiarity with ITSM systems (e.g., ServiceNow) and CMDB integrations
  • Understanding of metrics/logs/traces and their value in AI-driven operations.

Interested applicants, please email your resume to Andre Chua Jing Ming

Email: andrechua@recruitexpress.com.sg

CEI Reg No: R1989053

EA Licence No: 99C4599

Recruit Express Pte Ltd
