Job Title: Perception Engineer – Real-Time Manipulation Robotics
Location: Bristol, UK (Hybrid for top talent)
Employment Type: Full-time, Permanent
About the Role
We’re building the next generation of robot manipulation systems for unstructured, real-world environments. As a Perception Engineer, you will design, prototype, and deploy high-performance perception pipelines that enable our robots to see, understand, and dexterously interact with the world in real time. You’ll sit at the intersection of academia and product, translating state-of-the-art research into production-ready software running on ROS 2-based platforms.
What You’ll Do
- Own the real-time perception stack for robotic manipulation tasks, from sensor acquisition through to fused 3-D scene understanding and grasp/placement proposals.
- Research, prototype, and benchmark novel algorithms in 2-D/3-D vision, multi-modal fusion, and dense correspondence that push manipulation speed and reliability.
- Implement, optimize, and profile deep-learning models in PyTorch and C++ (CUDA) to meet strict latency budgets on embedded GPUs/accelerators.
- Integrate perception modules in ROS 2, ensuring clean interfaces, deterministic scheduling, and robust failure handling.
- Conduct rigorous real-world and simulated experiments, and communicate results through clear technical reports and publications (internal and external).
- Collaborate cross-functionally with controls, planning, and hardware teams to close perception–action loops and ship production-quality releases.
Minimum Qualifications
- PhD (or outstanding Master’s + equivalent publications) in Robotics, Computer Vision, Machine Learning, or a closely related field.
- 3+ years hands-on experience building real-time perception systems for robot manipulation or autonomous platforms.
- Advanced proficiency in Python and modern C++17/20; proven track record writing clean, testable, high-performance code.
- Deep expertise with PyTorch (training & inference) and GPU programming (CUDA, TensorRT, or similar).
- Production experience with ROS 2 (rclcpp/rclpy, lifecycle nodes, DDS tuning, real-time QoS).
- Strong publication record in top-tier venues (e.g., RSS, ICRA, CoRL, CVPR, RA-L).
Preferred Qualifications
- Track record shipping perception on manipulation platforms (e.g., mobile manipulators, bin-picking arms, industrial cobots).
- Familiarity with multi-sensor calibration, tactile or force perception, depth cameras (D-ToF, active stereo), and point-cloud processing (PCL, Open3D).
- Experience deploying on-device inference for NVIDIA Jetson/Orin, Intel Arc, or similar edge accelerators.
- Contributions to open-source robotics or vision libraries.
- Comfortable working in an agile, research-driven environment with fast iteration cycles.
What We Offer
- Competitive salary, equity, and performance bonus.
- Comprehensive health, dental, and vision coverage.
- Annual conference budget & dedicated research time.
- Flexible hours and hybrid/remote-friendly culture.
- State-of-the-art lab space with collaborative, cross-disciplinary teams.