DevOps and Data Engineer

SAGL CONSULTING PTE. LTD. · Singapore · On-site
SGD 70,000 - 90,000 · Full time

Job summary

A data consultancy firm based in Singapore is looking for an ETL & Data Engineer to design and maintain robust data pipelines and backend services. The role involves optimizing ETL processes and integrating with advanced AIOps and ML platforms. Ideal candidates should have strong programming skills in languages such as Python and Go, and experience with cloud data solutions and tools like Airflow and Kafka. This position offers a dynamic environment focused on AI-driven operations.

Qualifications

  • Strong experience in ETL and Data Engineering.
  • Proficient in building scalable ETL pipelines for real-time data.
  • Experienced with cloud computing and data storage solutions.

Responsibilities

  • Design, build, and maintain ETL pipelines for batch and real-time ingestion.
  • Optimize ETL processes for performance and fault tolerance.
  • Integrate with AIOps platforms using APIs.

Job description

Role Overview:

We are looking for an ETL & Data Engineer to design, build, and maintain robust data pipelines and backend services that power AI-driven operations. The role involves working with high-volume IT and cloud data, optimizing ETL processes, and integrating with AIOps platforms and ML pipelines.

Key Responsibilities:
  • Build and maintain scalable ETL pipelines for batch and real-time data ingestion, transformation, and loading from diverse sources (IT infrastructure, cloud, monitoring systems, APIs).
  • Implement data validation, cleansing, and normalization for consistent AI model input.
  • Develop backend services and APIs to support data ingestion, metadata management, and configuration.
  • Optimize ETL jobs for performance, fault tolerance, and low latency.
  • Integrate with AIOps platforms and ML pipelines using REST APIs or event-driven architectures.
  • Schedule and monitor ETL workflows using tools like Airflow, Prefect, or Dagster (a minimal sketch of such a workflow follows this list).
  • Support CI/CD pipelines for deploying ETL services and full-stack applications.
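
To illustrate the kind of orchestrated batch work described above, here is a minimal sketch of an hourly ETL workflow written as an Airflow DAG (TaskFlow API, Airflow 2.4+). The monitoring endpoint, connection ID, table, and field names are hypothetical placeholders for illustration, not details of this role's actual stack:

    # Minimal Airflow DAG sketch: hourly batch extract -> validate/normalize -> load.
    # Endpoint, connection, and table names below are hypothetical.
    from datetime import datetime, timedelta

    import requests
    from airflow.decorators import dag, task

    @dag(
        schedule="@hourly",  # batch cadence; real-time paths would arrive via Kafka instead
        start_date=datetime(2024, 1, 1),
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},  # basic fault tolerance
    )
    def monitoring_etl():
        @task
        def extract() -> list:
            # Pull raw metrics from a (hypothetical) monitoring API.
            resp = requests.get("https://monitoring.example.com/api/metrics", timeout=30)
            resp.raise_for_status()
            return resp.json()

        @task
        def transform(records: list) -> list:
            # Validate and normalize records so AI model input stays consistent.
            return [
                {"host": r["host"].lower(), "metric": r["name"], "value": float(r["value"])}
                for r in records
                if r.get("value") is not None
            ]

        @task
        def load(rows: list) -> None:
            # Load normalized rows into a (hypothetical) warehouse staging table.
            from airflow.providers.postgres.hooks.postgres import PostgresHook

            hook = PostgresHook(postgres_conn_id="warehouse")
            hook.insert_rows(
                table="metrics_staging",
                rows=[(r["host"], r["metric"], r["value"]) for r in rows],
                target_fields=["host", "metric", "value"],
            )

        load(transform(extract()))

    monitoring_etl()

The retries and the explicit validation step are what give a job like this its fault tolerance; scheduling and monitoring then come from Airflow's scheduler, task logs, and UI.
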
Required Skills & Tools:
  • Programming & Scripting: Python, Go (Golang), Java, Ruby, JavaScript/TypeScript (Next.js)
  • ETL & Data Engineering: Apache NiFi, Spark, Airflow, Flink, Kafka, Talend
  • Orchestration: Airflow, Prefect, Dagster
  • Data Storage & Lakes: PostgreSQL, MongoDB, Elasticsearch, Snowflake, BigQuery, S3, GCS, Azure Blob
  • Streaming Platforms: Kafka, Kinesis, Pub/Sub (see the ingestion sketch after this list)
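
For the streaming side, here is a minimal sketch of event-driven ingestion using a Kafka consumer (kafka-python) that normalizes events and forwards them to an AIOps platform over REST. The topic, broker address, consumer group, and endpoint URL are hypothetical placeholders:

    # Minimal Kafka ingestion sketch: consume -> normalize -> forward over REST.
    # Topic, broker, group, and endpoint names are hypothetical.
    import json

    import requests
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "infra-events",                    # hypothetical topic
        bootstrap_servers="localhost:9092",
        group_id="etl-ingest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        enable_auto_commit=False,          # commit only after a successful hand-off
    )

    for msg in consumer:
        event = msg.value
        # Normalize the payload so the downstream platform sees one consistent schema.
        payload = {
            "source": event.get("source", "unknown"),
            "severity": str(event.get("severity", "info")).lower(),
            "message": event.get("message", ""),
        }
        # Forward to a (hypothetical) AIOps ingestion endpoint; raising on failure
        # skips the commit, so the event is redelivered rather than lost.
        resp = requests.post("https://aiops.example.com/api/events", json=payload, timeout=10)
        resp.raise_for_status()
        consumer.commit()                  # at-least-once delivery

Committing the offset only after the POST succeeds trades occasional duplicate deliveries for zero data loss, which is usually the right default for observability data.
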
Good to Have:
  • Experience with AIOps & Observability tools like Splunk, Dynatrace, AppDynamics, New Relic, Elastic Stack
  • Familiarity with ITSM systems (ServiceNow) and CMDB integrations
  • Understanding of metrics, logs, and traces for AI-driven operations