
Learning Flexible Translation between Robot Actions and Language Descriptions

Authors :
Özdemir, Ozan
Kerzel, Matthias
Weber, Cornelius
Lee, Jae Hee
Wermter, Stefan
Source :
Proceedings of the 31st International Conference on Artificial Neural Networks (ICANN 2022)
Publication Year :
2022

Abstract

Handling various robot action-language translation tasks flexibly is an essential requirement for natural interaction between a robot and a human. Previous approaches require a change in the model architecture configuration for each task during inference, which undermines the premise of multi-task learning. In this work, we propose the paired gated autoencoders (PGAE) for flexible translation between robot actions and language descriptions in a tabletop object manipulation scenario. We train our model in an end-to-end fashion by pairing each action with appropriate descriptions that contain a signal indicating the translation direction. During inference, our model can flexibly translate from action to language and vice versa according to the given language signal. Moreover, with the option to use a pretrained language model as the language encoder, our model has the potential to recognise unseen natural language input. Our model can also recognise and imitate the actions of another agent by utilising robot demonstrations. The experimental results highlight the flexible bidirectional translation capabilities of our approach alongside the ability to generalise to the actions of an opposite-sitting agent.

Comment: Accepted at the 31st International Conference on Artificial Neural Networks (ICANN 2022)
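The abstract's central idea is that a single shared model handles both translation directions, with a signal word in the language input selecting action-to-language or language-to-action at inference time. The toy sketch below illustrates only that signal-driven routing interface; the signal words, vocabularies, and lookup tables are illustrative assumptions standing in for the paper's learned encoders and decoders, not the authors' implementation.

```python
# Toy sketch of a PGAE-style, signal-driven translation interface.
# The signal words and the table-lookup "model" are assumptions for
# illustration; the actual PGAE learns these mappings end to end.

SIGNALS = {"describe": "action->language", "execute": "language->action"}

# Toy paired data standing in for learned action/language representations.
ACTION_TO_DESC = {("push", "red cube"): "pushing the red cube",
                  ("lift", "blue ball"): "lifting the blue ball"}
DESC_TO_ACTION = {desc: act for act, desc in ACTION_TO_DESC.items()}

def translate(language_input, action_input=None):
    """Route one shared model according to the signal word that starts
    the language input, as the abstract describes for inference."""
    signal, _, rest = language_input.partition(" ")
    direction = SIGNALS[signal]
    if direction == "action->language":
        return ACTION_TO_DESC[action_input]  # emit a description
    return DESC_TO_ACTION[rest]              # emit an action

print(translate("describe", action_input=("push", "red cube")))
# pushing the red cube
print(translate("execute lifting the blue ball"))
# ('lift', 'blue ball')
```

Note that no architectural change is needed between the two calls: the same `translate` entry point serves both directions, which is the flexibility claim the abstract contrasts with prior per-task reconfiguration.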

Details

Database :
arXiv
Journal :
Proceedings of the 31st International Conference on Artificial Neural Networks (ICANN 2022)
Publication Type :
Report
Accession number :
edsarx.2207.07437
Document Type :
Working Paper
Full Text :
https://doi.org/10.1007/978-3-031-15931-2_21