Vision-Based Imitation Learning of Needle Reaching Skill for Robotic Precision Manipulation.
- Source :
- Journal of Intelligent & Robotic Systems; 2021, Vol. 101 Issue 1, p1-16, 16p
- Publication Year :
- 2021
Abstract
- In this paper, an imitation learning approach for vision-guided reaching skills is proposed for robotic precision manipulation, which enables the robot to adapt its end-effector's nonlinear motion while remaining aware of collision avoidance. The reaching-skill model first takes raw images of the objects as inputs and generates incremental motion commands to guide the lower-level vision-based controller. The needle's tip is detected in image space, and the obstacle region is extracted by image segmentation. A neighborhood-sampling method, which includes a neural-network-based attention module, is designed for collision perception of the needle component. The neural-network-based policy module infers the desired motion in image space from the neighborhood-sampling result and the goal and current positions of the needle's tip. A refinement module is developed to further improve the performance of the policy module. In three-dimensional (3D) manipulation tasks, two cameras are typically used for image-based visual control. Therefore, considering the epipolar constraint, the relative movements in the two cameras' views are refined by optimization. Experiments are conducted to validate the effectiveness of the proposed methods. [ABSTRACT FROM AUTHOR]
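- The epipolar-constrained refinement described in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation: the function name refine_motions, the weight w, and the soft least-squares formulation are illustrative assumptions; it only presumes a known fundamental matrix F between the two camera views and needle-tip positions and motion increments given in pixel coordinates.

```python
"""Hedged sketch (assumed, not the paper's code): adjust the incremental
motions predicted for two camera views so that the moved needle-tip
positions better satisfy the epipolar constraint x2^T F x1 = 0."""
import numpy as np
from scipy.optimize import minimize

def refine_motions(p1, p2, d1, d2, F, w=10.0):
    """Return refined increments close to (d1, d2) whose resulting tip
    positions (p1+d1', p2+d2') reduce the epipolar residual.

    p1, p2 : current tip positions in the two images, shape (2,)
    d1, d2 : desired incremental motions from the policy, shape (2,)
    F      : 3x3 fundamental matrix between view 1 and view 2
    w      : weight on the epipolar residual versus deviation from d1, d2
    """
    def to_h(p):
        # Homogeneous pixel coordinates.
        return np.array([p[0], p[1], 1.0])

    def cost(x):
        e1, e2 = x[:2], x[2:]                  # corrections to the increments
        q1 = to_h(p1 + d1 + e1)                # refined tip position, view 1
        q2 = to_h(p2 + d2 + e2)                # refined tip position, view 2
        epi = q2 @ F @ q1                      # epipolar residual
        return w * epi**2 + e1 @ e1 + e2 @ e2  # stay close to policy output

    res = minimize(cost, np.zeros(4), method="BFGS")
    e1, e2 = res.x[:2], res.x[2:]
    return d1 + e1, d2 + e2

if __name__ == "__main__":
    # Toy usage with an arbitrary, purely illustrative fundamental matrix.
    F = np.array([[0.0,   -1e-4,  0.02],
                  [1e-4,   0.0,  -0.03],
                  [-0.02,  0.03,  1.0]])
    p1, p2 = np.array([320.0, 240.0]), np.array([300.0, 250.0])
    d1, d2 = np.array([5.0, -2.0]), np.array([4.0, -1.0])
    r1, r2 = refine_motions(p1, p2, d1, d2, F)
    print("refined increments:", r1, r2)
```

- In this sketch the epipolar consistency is enforced only softly, trading it off against fidelity to the policy's predicted motion; the paper's actual optimization formulation may differ.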
Details
- Language :
- English
- ISSN :
- 0921-0296
- Volume :
- 101
- Issue :
- 1
- Database :
- Complementary Index
- Journal :
- Journal of Intelligent & Robotic Systems
- Publication Type :
- Academic Journal
- Accession number :
- 147654485
- Full Text :
- https://doi.org/10.1007/s10846-020-01290-1