SURRL: Structural Unsupervised Representations for Robot Learning
- Source :
- IEEE Transactions on Cognitive and Developmental Systems; 2023, Vol. 15, Issue 2, pp. 819-831 (13 pp.)
- Publication Year :
- 2023
Abstract
- Revolutionary advances have occurred in robot learning research, where a resurgence of reinforcement learning (RL) algorithms has fueled breakthroughs in acquiring complicated robotic skills without human intervention. Unfortunately, one limitation that has hampered RL methods is their sample inefficiency: they may require unrealistic training time and prohibitively many trajectories, which makes multitask learning impractical. More broadly, realizing robotic control with RL relies heavily on the availability of compact and expressive representations of the state space. In this article, we propose to learn general structural representations by exploiting structural prior knowledge of robots. In particular, a novel framework called structural unsupervised representations for robot learning (SURRL) is presented to enable multitask learning. Task-agnostic sample trajectories are leveraged to learn structural representations, which are constructed by a graph autoencoder (GAE) in an unsupervised fashion. When learning a new task, the learned structural representations can be adopted directly for subsequent policy learning without training from scratch. Extensive experiments in continuous robotic environments, including dexterous manipulation tasks, demonstrate our method's effectiveness in learning optimal policies across multiple tasks, with significant improvements over other competitive methods.
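The abstract's core mechanism, representing a robot's structure as a graph and encoding it with a graph autoencoder, can be illustrated with a minimal sketch. The joint graph, feature dimensions, and weights below are illustrative assumptions, not details taken from the SURRL paper; the encoder/decoder follows the standard GAE pattern (a GCN-style encoder plus an inner-product decoder that reconstructs the adjacency matrix).

```python
import numpy as np

def normalize_adj(A):
    """Symmetric normalization with self-loops: D^-1/2 (A + I) D^-1/2."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gae_embed(A, X, W):
    """One GCN layer: relu(A_norm @ X @ W) -> per-node embeddings Z."""
    return np.maximum(0.0, normalize_adj(A) @ X @ W)

def decode(Z):
    """Inner-product decoder: reconstructed edge probabilities sigmoid(Z Z^T)."""
    return 1.0 / (1.0 + np.exp(-(Z @ Z.T)))

# Toy 4-joint kinematic chain (a simple arm, assumed for illustration): 0-1-2-3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))   # per-joint state features (assumed)
W = rng.normal(size=(3, 2))   # encoder weights (random here; learned in practice)

Z = gae_embed(A, X, W)        # structural representation, one row per joint
A_rec = decode(Z)             # reconstruction target for unsupervised training
```

In an unsupervised setup like the one the abstract describes, the encoder weights would be trained so that `A_rec` matches the true adjacency `A`, after which the embeddings `Z` can be reused as state representations for downstream policy learning.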
Details
- Language :
- English
- ISSN :
- 2379-8920
- Volume :
- 15
- Issue :
- 2
- Database :
- Supplemental Index
- Journal :
- IEEE Transactions on Cognitive and Developmental Systems
- Publication Type :
- Periodical
- Accession number :
- ejs63271269
- Full Text :
- https://doi.org/10.1109/TCDS.2022.3187186