Senior DevOps Engineer

RECRUIT EXPRESS PTE LTD

Pasir Panjang

On-site

MYR 60,000 - 90,000

Full time

Yesterday

Job summary

A leading recruitment agency in Negeri Sembilan is seeking a skilled Data Engineer to build ETL pipelines and backend services using languages like Python and Go. Responsibilities include ensuring high-performance data processing and implementing DevOps practices. Ideal candidates should have experience with various data engineering tools and programming skills. Interested applicants should submit their resumes to the provided email.

Qualifications

  • Proficient in programming languages like Python, Ruby, Go, Java, or JavaScript/TypeScript.
  • Experience with data storage and ETL frameworks.
  • Familiarity with orchestration tools like Apache Airflow or Prefect.

Responsibilities

  • Build and maintain ETL pipelines to process high-volume data.
  • Develop backend services in various programming languages.
  • Implement CI/CD pipelines for deploying ETL services.

Skills

Python
ETL scripting
Ruby
Go (Golang)
Java
JavaScript/TypeScript

Tools

Apache NiFi
Spark
Airflow
Flink
Kafka
PostgreSQL

Job description

Key Responsibilities:

ETL & Data Engineering:

  • Build and maintain robust ETL pipelines to ingest, transform, and load high-volume, high-velocity data from IT infrastructure, monitoring systems, and cloud environments.
  • Develop batch and real-time data flows using frameworks.
  • Optimize ETL jobs for scalability, fault tolerance, and low latency.
  • Implement data validation, cleansing, and normalization processes for consistent AI model input.
  • Integrate with AIOps platforms and ML pipelines using REST APIs or event-driven architectures.
  • Develop and maintain robust data pipelines for ingesting, filtering, transforming, and loading data from various sources (e.g., network devices, appliances, databases, APIs, cloud storage).

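The validate, cleanse, and normalize steps described above can be sketched in plain Python. This is a minimal illustration only; the field names, record format, and in-memory sink are assumptions, not details from the posting:

```python
import json

# Hypothetical raw monitoring events as JSON lines; field names are illustrative.
RAW_EVENTS = [
    '{"host": "sw-01", "cpu": "87", "ts": "2024-01-01T00:00:00Z"}',
    '{"host": "sw-02", "cpu": null, "ts": "2024-01-01T00:00:05Z"}',
    '{"host": "fw-01", "cpu": "42", "ts": "2024-01-01T00:00:10Z"}',
]

def extract(lines):
    """Parse raw JSON lines into dicts."""
    return [json.loads(line) for line in lines]

def transform(records):
    """Validate, cleanse, and normalize records for consistent downstream input."""
    clean = []
    for r in records:
        if r.get("cpu") is None:  # validation: drop records missing the metric
            continue
        # normalization: consistent field names and numeric types
        clean.append({"host": r["host"], "cpu_pct": float(r["cpu"]), "ts": r["ts"]})
    return clean

def load(records, sink):
    """Append normalized records to a sink (a list here, standing in for a database)."""
    sink.extend(records)
    return len(records)

sink = []
loaded = load(transform(extract(RAW_EVENTS)), sink)
```

In a production pipeline the same extract/transform/load shape would typically be expressed as tasks in an orchestrator such as Airflow or Prefect, with the sink replaced by a real data store.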
Application & Backend Development:

  • Design and build backend services (microservices, APIs) in Python, Go, Java, or Ruby to support data ingestion, metadata services, and configuration management.
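A data-ingestion endpoint of the kind described could be sketched with Python's standard library alone. The route, payload shape, and response fields below are illustrative assumptions, not details from the posting:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class IngestHandler(BaseHTTPRequestHandler):
    """Accepts a JSON record via POST and echoes it back with an acceptance flag."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        record = json.loads(self.rfile.read(length) or b"{}")
        record["accepted"] = True  # hypothetical ingestion acknowledgement
        body = json.dumps(record).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet; a real service would log structured requests

def serve():
    """Start the ingestion service on an ephemeral port in a background thread."""
    server = HTTPServer(("127.0.0.1", 0), IngestHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A production microservice would add authentication, schema validation, and persistence, and would more likely use a framework (e.g., FastAPI in Python or net/http in Go), but the request/response contract is the same idea.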

DevOps & Orchestration:

  • Use tools like Elastic Stack, Apache Airflow, Prefect, or Dagster to schedule and monitor ETL jobs.
  • Implement CI/CD pipelines for deploying ETL services and full-stack apps.

Required Skillset:

Programming & Scripting:

  • Proficient in one or more of the following:
    • Python – ETL scripting, backend APIs, ML interfacing
    • Ruby – Scripting and legacy tool integration
    • Go (Golang) – High-performance microservices, concurrency handling
    • Java – Enterprise-grade systems, Spark, Kafka pipelines
    • JavaScript/TypeScript with Next.js – Frontend dashboards and admin UIs

Data Engineering Tools:

  • Proficient in one or more of the following:
    • ETL Frameworks: Apache NiFi, Spark, Airflow, Flink, Kafka, Talend
    • Orchestration: Airflow, Prefect, Dagster
    • Data Storage: PostgreSQL, MongoDB, Elasticsearch, Snowflake, BigQuery
    • Streaming Platforms: Kafka, Kinesis, Pub/Sub
    • Data Lake/File Storage: S3, GCS, Azure Blob

Good to have:

AIOps & Observability:

  • Experience integrating with tools like Splunk, Dynatrace, AppDynamics, New Relic, Elastic Stack
  • Familiarity with ITSM systems (e.g., ServiceNow) and CMDB integrations
  • Understanding of metrics/logs/traces and their value in AI-driven operations

Interested applicants please send your resume to venessagoh@recruitexpress.com.sg

Venessa Goh Wee Ni

R24124686

Recruit Express Pte Ltd

EA License No: 99C4599

RCB No.: 199601303W

We regret that only shortlisted candidates will be contacted.
