Data Engineer

Red Alpha Cybersecurity

Singapore

On-site

SGD 60,000 - 80,000

Full time

Job summary

A leading cybersecurity firm in Singapore is seeking a Data Engineer to design and maintain scalable data pipelines. The role involves optimizing data storage and retrieval while ensuring data quality. Candidates should have experience with Python, data orchestration tools, and cloud technologies. Selected candidates will undergo 11 weeks of fully sponsored training before being deployed full-time.

Benefits

11-week fully sponsored on-the-job training

Qualifications

  • Experience in designing and maintaining scalable data pipelines.
  • Ability to monitor and improve data quality.
  • Knowledge of cloud technologies and data orchestration.
  • Comfort working in diverse operating environments.

Responsibilities

  • Design and maintain data pipelines for business and AI applications.
  • Collaborate with teams to translate data needs into solutions.
  • Optimize data storage and retrieval across various databases.
  • Implement data orchestration and automate workflows.

Job description

Design, build, and maintain scalable, reliable data pipelines to support diverse business and AI applications

Take ownership of pipeline health: proactively monitor performance, debug issues, and improve data quality

Collaborate with cross-functional teams to translate data needs into technical solutions

Implement data pipeline orchestration using tools such as Apache Airflow or AWS Step Functions
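
To make that expectation concrete, here is a minimal sketch of what such orchestration could look like in Apache Airflow 2.x; the DAG name and task bodies are illustrative placeholders, not part of the role description:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Illustrative ETL stages; a real pipeline would replace these bodies.
    def extract():
        print("pull data from source systems")

    def transform():
        print("clean and reshape records")

    def load():
        print("write results to the warehouse")

    with DAG(
        dag_id="example_daily_etl",       # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                # Airflow 2.4+ scheduling argument
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)
        t_extract >> t_transform >> t_load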

Optimize data storage and retrieval across RDBMS (Postgres, MSSQL), object stores (MinIO/S3), and graph databases (Neo4j)

Design data schemas (normalized, denormalized, star-schema) and enforce proper access and backup protocols
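
As a rough, hypothetical illustration of the star-schema design mentioned above, the sketch below creates one fact table and two dimension tables in Postgres via psycopg2; the connection string and all table and column names are invented for the example:

    import psycopg2

    # Hypothetical connection string; adjust to your environment.
    conn = psycopg2.connect("dbname=analytics user=etl")
    with conn, conn.cursor() as cur:
        # A classic star schema: a central fact table referencing dimensions.
        cur.execute("""
            CREATE TABLE IF NOT EXISTS dim_date (
                date_key  INT  PRIMARY KEY,
                full_date DATE NOT NULL
            );
            CREATE TABLE IF NOT EXISTS dim_customer (
                customer_key INT  PRIMARY KEY,
                name         TEXT NOT NULL
            );
            CREATE TABLE IF NOT EXISTS fact_sales (
                date_key     INT REFERENCES dim_date,
                customer_key INT REFERENCES dim_customer,
                amount       NUMERIC(12, 2) NOT NULL
            );
        """)
    conn.close()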

Support analytics, product development, and AI POCs with clean, well-managed data structures

Apply scripting (Python) and open-source tools to automate data workflows and ETL processes
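
A minimal sketch of one such automated workflow, assuming pandas, pyarrow, and boto3 are available; the file, bucket, and endpoint names are placeholders, and the endpoint_url parameter is what lets the same client code target either MinIO or AWS S3:

    import io

    import boto3
    import pandas as pd

    # Extract, then apply a simple data-quality rule (names are placeholders).
    df = pd.read_csv("events.csv")
    df = df.dropna(subset=["user_id"])

    # Transform to Parquet in memory (requires pyarrow).
    buf = io.BytesIO()
    df.to_parquet(buf, index=False)

    # Load to an object store; endpoint_url points at a local MinIO here.
    s3 = boto3.client("s3", endpoint_url="http://localhost:9000")
    s3.put_object(Bucket="clean-data", Key="events.parquet", Body=buf.getvalue())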

What You Should Know or Be Eager to Learn
  • Python for scripting and data manipulation
  • Cloud and storage systems: MinIO, AWS S3, AWS tech stack (preferred)
  • Data orchestration tools: Apache Airflow, AWS Step Functions
  • Familiarity with BI tools like Tableau
  • Comfort working in both Windows and Linux environments
  • Basic understanding of networks, OS commands, and system-level data operations

Mindset We Look For
  • Strong problem-solving mindset with attention to detail
  • Customer-first attitude: designing with real users and business needs in mind
  • Flexibility to adapt tools and approaches based on the use case
  • Eagerness to learn and grow in a high-performance tech environment

All selected candidates will undergo 11 weeks of fully sponsored on-the-job training, with an allowance, before being deployed full-time for 3 years.
