Data Engineer

Red Alpha

Singapore

On-site

SGD 60,000 - 80,000

Full time

5 days ago

Job summary

A leading cybersecurity company in Singapore is seeking passionate Data Engineers to design and maintain scalable data pipelines. You will work closely with cross-functional teams, implement data orchestration tools like Apache Airflow, and optimize data storage solutions. Candidates should possess a strong problem-solving mindset and a customer-first attitude. An exciting 11-week, fully sponsored on-the-job training programme is provided before full-time deployment for 3 years.

Benefits

11-week fully sponsored on-the-job training with allowance
Long-term employment opportunity

Qualifications

  • Strong problem-solving mindset with attention to detail.
  • Customer-first attitude, designing with real users in mind.
  • Flexibility to adapt tools and approaches.

Responsibilities

  • Design, build, and maintain scalable data pipelines.
  • Take ownership of pipeline health and performance.
  • Collaborate with teams to translate data needs into solutions.
  • Implement data pipeline orchestration using tools like Apache Airflow.
  • Optimize data storage and retrieval across diverse systems.
  • Support analytics and AI projects with well-managed data.

Skills

Python for scripting and data manipulation
Cloud and storage systems: MinIO, AWS S3
Data orchestration tools: Apache Airflow, AWS Step Functions
Familiarity with BI tools like Tableau
Comfortable in both Windows and Linux environments
Basic understanding of networks, OS commands

Tools

Apache Airflow
AWS S3
Postgres
MinIO
Neo4j

Job description

Red Alpha Cybersecurity is looking for aspiring Data Engineers who are passionate about building robust data pipelines and enabling data-driven solutions. You’ll work closely with cross-functional teams, including data scientists, analysts, and software engineers, to deliver high-performance, production-grade data infrastructure.

What You’ll Be Doing

Design, build, and maintain scalable, reliable data pipelines to support diverse business and AI applications

Take ownership of pipeline health: proactively monitor performance, debug issues, and improve data quality

Collaborate with cross-functional teams to translate data needs into technical solutions

Implement data pipeline orchestration using tools such as Apache Airflow or AWS Step Functions (see the DAG sketch after this list)

Optimise data storage and retrieval across RDBMS (Postgres, MSSQL), object stores (MinIO/S3), and graph databases (Neo4j)

Design data schemas (normalized, denormalized, star-schema) and enforce proper access and backup protocols (see the schema sketch after this list)

Support analytics, product development, and AI POCs with clean, well-managed data structures

Apply scripting (Python) and open-source tools to automate data workflows and ETL processes
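
As a rough flavour of the orchestration work described above, here is a minimal sketch of a daily ETL DAG, assuming Airflow 2.x (2.4+ for the `schedule` argument). The DAG id, task logic, and data are illustrative placeholders, not details of this role's actual stack.

```python
# A minimal daily ETL DAG sketch, assuming Airflow 2.4+.
# All ids, data, and logic are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull rows from a source system (e.g. Postgres).
    return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]


def transform(ti, **context):
    # Pull the upstream result via XCom and apply a trivial transform.
    rows = ti.xcom_pull(task_ids="extract")
    return [{**row, "value": row["value"] * 2} for row in rows]


def load(ti, **context):
    # Placeholder: write transformed rows to the target store
    # (e.g. a Postgres table or an S3/MinIO bucket).
    print(ti.xcom_pull(task_ids="transform"))


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

And as a rough illustration of the star-schema design mentioned above, a minimal sketch using Python's built-in sqlite3; every table and column name here is invented for illustration. Dimension tables carry descriptive attributes, while the fact table holds measures plus a foreign key into each dimension.

```python
# A minimal star-schema sketch using sqlite3.
# All table and column names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables hold descriptive attributes.
    CREATE TABLE dim_date (
        date_id INTEGER PRIMARY KEY,
        day TEXT NOT NULL
    );
    CREATE TABLE dim_product (
        product_id INTEGER PRIMARY KEY,
        name TEXT NOT NULL
    );
    -- The fact table holds measures plus foreign keys to each dimension.
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        date_id INTEGER REFERENCES dim_date(date_id),
        product_id INTEGER REFERENCES dim_product(product_id),
        quantity INTEGER NOT NULL,
        amount REAL NOT NULL
    );
""")
```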

What You Should Know or Be Eager to Learn

Preferred Technical Skills (not mandatory)

Python for scripting and data manipulation

Cloud and storage systems: MinIO, AWS S3, AWS tech stack (preferred); see the storage sketch after this list

Data orchestration tools: Apache Airflow, AWS Step Functions

Familiarity with BI tools like Tableau

Comfort working in both Windows and Linux environments

Basic understanding of networks, OS commands, and system-level data operations
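
On the storage side, MinIO exposes an S3-compatible API, so the same boto3 client code can target either MinIO or AWS S3. A minimal sketch, assuming a local MinIO instance; the endpoint, credentials, and bucket/key names are placeholders.

```python
# A minimal sketch of talking to MinIO (or AWS S3) via boto3.
# Endpoint, credentials, and bucket/key names are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:9000",  # MinIO speaks the S3 protocol
    aws_access_key_id="minioadmin",        # placeholder credentials
    aws_secret_access_key="minioadmin",
)

# Write a small object, then read it back.
s3.put_object(Bucket="raw-data", Key="events/sample.json", Body=b'{"ok": true}')
obj = s3.get_object(Bucket="raw-data", Key="events/sample.json")
print(obj["Body"].read())
```

Dropping `endpoint_url` (and supplying real AWS credentials) points the same code at AWS S3.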

Mindset We Look For

Strong problem-solving mindset with attention to detail

Customer-first attitude, designing with real users and business needs in mind

Flexibility to adapt tools and approaches based on the use case

Eagerness to learn and grow in a high‑performance tech environment

All selected candidates will undergo an 11-week, fully sponsored on-the-job training programme with allowance before being deployed full-time for 3 years.
