
Adapting Skills to Novel Grasps: A Self-Supervised Approach

Authors:
Papagiannis, Georgios
Dreczkowski, Kamil
Vosylius, Vitalis
Johns, Edward
Publication Year:
2024

Abstract

In this paper, we study the problem of adapting manipulation trajectories involving grasped objects (e.g. tools), defined for a single grasp pose, to novel grasp poses. A common approach is to explicitly define a new trajectory for each possible grasp, but this is highly inefficient. Instead, we propose a method that adapts such trajectories directly, requiring only a period of self-supervised data collection during which a camera observes the robot's end-effector moving with the object rigidly grasped. Importantly, our method requires no prior knowledge of the grasped object (such as a 3D CAD model), works with RGB images, depth images, or both, and requires no camera calibration. Through a series of real-world experiments involving 1360 evaluations, we find that self-supervised RGB data consistently outperforms alternatives that rely on depth images, including several state-of-the-art pose estimation methods. Compared to the best-performing baseline, our method achieves an average of 28.5% higher success rate when adapting manipulation trajectories to novel grasps across several everyday tasks. Videos of the experiments are available on our webpage at https://www.robot-learning.uk/adapting-skills

Comment: Accepted at IROS 2024
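The underlying geometry of the adaptation problem can be sketched as a rigid-transform remapping: if the object is rigidly grasped, an end-effector trajectory demonstrated under one grasp can be re-expressed for a new grasp so that the object itself follows the same path. This is only a minimal illustrative sketch of that idea, not the paper's self-supervised, calibration-free pipeline; the function names and frame conventions (grasp = object pose in the end-effector frame, poses as 4x4 homogeneous matrices) are assumptions for illustration:

```python
import numpy as np

def se3(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def adapt_trajectory(ee_traj, T_grasp_old, T_grasp_new):
    """Remap end-effector poses so the grasped object traces the same path.

    ee_traj: list of 4x4 end-effector poses demonstrated with the old grasp.
    T_grasp_old / T_grasp_new: object pose expressed in the end-effector frame
    for the demonstrated and the novel grasp, respectively (assumed known here;
    in the paper this relation is what the self-supervised data provides).

    The object's world pose is T_ee @ T_grasp, so requiring
    T_ee_new @ T_grasp_new == T_ee_old @ T_grasp_old gives the correction below.
    """
    correction = T_grasp_old @ np.linalg.inv(T_grasp_new)
    return [T_ee @ correction for T_ee in ee_traj]
```

Under these assumptions, `adapted[i] @ T_grasp_new` equals `ee_traj[i] @ T_grasp_old` for every waypoint, i.e. the object's world-frame path is unchanged despite the new grasp.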

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2408.00178
Document Type:
Working Paper