Deep Learning Method for Grasping Novel Objects Using Dexterous Hands.
- Source :
- IEEE Transactions on Cybernetics; May 2022, Vol. 52 Issue 5, p2750-2762, 13p
- Publication Year :
- 2022
Abstract
- Robotic grasping ability still lags far behind human skill and remains a significant challenge in robotics research. Humans select appropriate finger grasping postures according to the part of an object to be grasped, and even when grasping the same part of an object, different palm poses lead them to select different grasping postures. Inspired by these human skills, in this article, we propose new grasping posture prediction networks (GPPNs) with multiple inputs, which combine the object image with the palm pose of the dexterous hand to predict appropriate grasping postures. The GPPNs are further combined with grasping rectangle detection networks (GRDNs) to construct multilevel convolutional neural networks (ML-CNNs). In this study, a force-closure index was designed to analyze grasping quality, and force-closure grasping postures were generated in the GraspIt! environment. Depth images of objects were captured in the Gazebo environment to construct the dataset for the GPPNs. Simulation experiments were conducted in the GraspIt! environment to study the influence of the image input and the palm pose input on the GPPNs using a variable-controlling approach, and the ML-CNNs were compared with existing grasp detection methods. The simulation results verify that the ML-CNNs achieve high grasping quality. Grasping experiments on the Shadow hand platform further show that the ML-CNNs can accurately grasp novel objects with good performance. [ABSTRACT FROM AUTHOR]
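
A minimal sketch of the kind of multi-input architecture the abstract describes: one branch encodes a depth image of the object, a second branch encodes the palm pose of the dexterous hand, and the fused features regress a grasping posture (finger joint angles). This is not the authors' code; the input sizes, layer widths, 7-D palm pose, and 24-joint output (Shadow-hand-like) are assumptions for illustration only.

```python
import torch
import torch.nn as nn


class GraspPosturePredictor(nn.Module):
    """Illustrative two-branch network: depth image + palm pose -> joint angles."""

    def __init__(self, pose_dim: int = 7, num_joints: int = 24):
        super().__init__()
        # Image branch: small CNN over a 1-channel depth image (assumed 128x128).
        self.image_branch = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),                      # -> 64 * 4 * 4 = 1024 features
        )
        # Palm-pose branch: MLP over the palm pose (e.g., position + quaternion).
        self.pose_branch = nn.Sequential(
            nn.Linear(pose_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
        )
        # Fusion head: concatenated features -> predicted grasping posture.
        self.head = nn.Sequential(
            nn.Linear(1024 + 64, 256), nn.ReLU(),
            nn.Linear(256, num_joints),
        )

    def forward(self, depth_image: torch.Tensor, palm_pose: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_branch(depth_image)   # (B, 1024)
        pose_feat = self.pose_branch(palm_pose)     # (B, 64)
        return self.head(torch.cat([img_feat, pose_feat], dim=1))


if __name__ == "__main__":
    model = GraspPosturePredictor()
    depth = torch.randn(2, 1, 128, 128)   # batch of depth images
    pose = torch.randn(2, 7)              # batch of palm poses
    print(model(depth, pose).shape)       # torch.Size([2, 24])
```

The two-branch design mirrors the paper's stated idea that the same object region can call for different finger postures depending on the palm pose, so the pose must enter the network as an explicit input alongside the image rather than being inferred from it.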
Details
- Language :
- English
- ISSN :
- 2168-2267
- Volume :
- 52
- Issue :
- 5
- Database :
- Complementary Index
- Journal :
- IEEE Transactions on Cybernetics
- Publication Type :
- Academic Journal
- Accession number :
- 157007074
- Full Text :
- https://doi.org/10.1109/TCYB.2020.3022175