
Research Assistant (Visual-Language Manipulation)

National University of Singapore

Singapore

On-site

SGD 60,000 - 80,000

Full time


Job summary

A leading educational institution in Singapore seeks a candidate for a robotics project focused on efficient multimodal robot learning for manipulation. Responsibilities include contributing to manipulation pipelines and implementing safety-aware modules. Ideal candidates have strong programming skills in Python and a background in robot manipulation and simulation tools such as ROS, Isaac Gym, or Gazebo.

Qualifications

  • Experience in robot data collection and calibration.
  • Interest in multimodal learning in robotics.
  • Familiarity with large-scale manipulation datasets.

Responsibilities

  • Build manipulation pipelines combining perception, language, and control.
  • Implement safety and uncertainty-aware modules.
  • Develop and maintain simulation environments for training and testing.
  • Analyze results and document findings for dissemination.

Skills

Strong programming skills in Python
Experience with ROS / ROS2
Knowledge of robotics simulation tools
Background in robot manipulation
Familiarity with vision-language models
Analytical and troubleshooting skills
Ability to work independently and collaboratively

Tools

Python
C++
Isaac Lab / Isaac Gym / PyBullet / Gazebo
ROS / ROS2

How to apply

Interested applicants are invited to apply directly via the NUS Career Portal.

Your application will be processed only if it is submitted via the NUS Career Portal.

We regret that only shortlisted candidates will be notified.

Job Description

This position involves working on a project focused on efficient multimodal robot learning for manipulation, with emphasis on vision-language-action (VLA) systems. The candidate will help bridge simulation and real robot systems to enable robust, safe manipulation in real-world environments.

The candidate will:

  • Contribute to building manipulation pipelines that combine perception, language, and control.
  • Implement and evaluate safety and uncertainty-aware modules to monitor and filter robot behaviors.
  • Perform data collection, calibration, and annotation on robotic manipulators and mobile manipulation platforms (such as Mobile ALOHA).
  • Develop and maintain simulation environments in Isaac Lab / Isaac Gym / PyBullet / Gazebo for training and testing.
  • Work with large manipulation datasets (e.g. LIBERO, RoboCasa, DROID) to guide model training, generalization, and benchmarking.
  • Collaborate with the PI and research team to design experiments, analyze results, document findings, and support dissemination (e.g. internal reports, code releases).

Qualifications

  • Strong programming skills in Python (experience in C++ is a plus).
  • Experience with ROS / ROS2, and robotics simulation tools (e.g. Isaac Lab / Isaac Gym / PyBullet / Gazebo).
  • Background in robot manipulation, motion control, and trajectory planning.
  • Familiarity with vision-language models / architectures (VLMs/VLAs) or multimodal learning in robotics.
  • Experience or strong interest in robot data collection, teleoperation, calibration, and evaluation.
  • Exposure to large-scale manipulation datasets such as LIBERO, RoboCasa, DROID, or similar.
  • Preferred: experience with Mobile ALOHA or mobile manipulation platforms.
  • Good analytical, troubleshooting, and experimental design skills.
  • Ability to work independently as well as collaboratively within a research team.