
Learning Any-View 6DoF Robotic Grasping in Cluttered Scenes via Neural Surface Rendering

Authors :
Jauhri, Snehal
Lunawat, Ishikaa
Chalvatzaki, Georgia
Publication Year :
2023

Abstract

A significant challenge for real-world robotic manipulation is the effective 6DoF grasping of objects in cluttered scenes from any single viewpoint without the need for additional scene exploration. This work reinterprets grasping as rendering and introduces NeuGraspNet, a novel method for 6DoF grasp detection that leverages advances in neural volumetric representations and surface rendering. It encodes the interaction between a robot's end-effector and an object's surface by jointly learning to render the local object surface and learning grasping functions in a shared feature space. The approach uses global (scene-level) features for grasp generation and local (grasp-level) neural surface features for grasp evaluation. This enables effective, fully implicit 6DoF grasp quality prediction, even in partially observed scenes. NeuGraspNet operates on random viewpoints, common in mobile manipulation scenarios, and outperforms existing implicit and semi-implicit grasping methods. The real-world applicability of the method has been demonstrated with a mobile manipulator robot, grasping in open, cluttered spaces. Project website at https://sites.google.com/view/neugraspnet

Comment: Accepted at Robotics: Science and Systems 2024

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2306.07392
Document Type :
Working Paper