ETL & Data Engineer

SAGL CONSULTING PTE. LTD.

Singapore

On-site

SGD 60,000 - 90,000

Full time

Job summary

A data consulting firm in Singapore is seeking an ETL & Data Engineer to design and maintain robust data pipelines for AI-driven operations. The role involves optimizing ETL processes and integrating with AIOps platforms. Ideal candidates have strong programming skills in Python and Go, plus experience with tools such as Apache NiFi and Airflow.

Qualifications

  • Experience in designing and maintaining ETL pipelines for processing large data volumes.
  • Proficient in data engineering practices and in the programming languages listed under Skills.
  • Familiar with cloud data storage and management.

Responsibilities

  • Design, build, and maintain data pipelines and backend services.
  • Optimize ETL processes for performance and reliability.
  • Integrate with AIOps and ML platforms.

Skills

Python, Go (Golang), Java, Ruby, JavaScript/TypeScript, Apache NiFi, Spark, Airflow, Flink, Kafka, Talend, PostgreSQL, MongoDB, Elasticsearch, Snowflake, BigQuery, S3, GCS, Azure Blob, Kinesis, Pub/Sub
Job description

Role Overview

We are seeking an ETL & Data Engineer to design, build, and maintain robust data pipelines and backend services that power AI-driven operations. The role involves working with high-volume IT and cloud data, optimizing ETL processes, and integrating with AIOps platforms and ML pipelines.

Key Responsibilities
  • Build and maintain scalable ETL pipelines for batch and real-time data ingestion, transformation, and loading from diverse sources (IT infrastructure, cloud, monitoring systems, APIs).
  • Implement data validation, cleansing, and normalization for consistent AI model input.
  • Develop backend services and APIs to support data ingestion, metadata management, and configuration.
  • Optimize ETL jobs for performance, fault tolerance, and low latency.
  • Integrate with AIOps platforms and ML pipelines using REST APIs or event-driven architectures.
  • Schedule and monitor ETL workflows using tools like Airflow, Prefect, or Dagster.
  • Support CI/CD pipelines for deploying ETL services and full-stack applications.
Required Skills & Tools
  • Programming & Scripting: Python, Go (Golang), Java, Ruby, JavaScript/TypeScript (Next.js)
  • ETL & Data Engineering: Apache NiFi, Spark, Airflow, Flink, Kafka, Talend
  • Orchestration: Airflow, Prefect, Dagster
  • Data Storage & Lakes: PostgreSQL, MongoDB, Elasticsearch, Snowflake, BigQuery, S3, GCS, Azure Blob
  • Streaming Platforms: Kafka, Kinesis, Pub/Sub
Good to Have
  • Experience with AIOps & Observability tools like Splunk, Dynatrace, AppDynamics, New Relic, Elastic Stack
  • Familiarity with ITSM systems (ServiceNow) and CMDB integrations
  • Understanding of metrics, logs, and traces for AI-driven operations