
Learning Bidirectional Action-Language Translation with Limited Supervision and Testing with Incongruent Input

Authors :
Ozan Özdemir
Matthias Kerzel
Cornelius Weber
Jae Hee Lee
Muhammad Burhan Hafez
Patrick Bruns
Stefan Wermter
Source :
Applied Artificial Intelligence, Vol 37, Iss 1 (2023)
Publication Year :
2023
Publisher :
Taylor & Francis Group, 2023.

Abstract

Human infant learning happens during exploration of the environment, through interaction with objects, and by listening to and casually repeating utterances, which is analogous to unsupervised learning. Only occasionally does a learning infant receive a matching verbal description of an action it is performing, which is similar to supervised learning. Such a learning mechanism can be mimicked with deep learning. We model this weakly supervised learning paradigm with our Paired Gated Autoencoders (PGAE) model, which combines an action autoencoder and a language autoencoder. After observing a performance drop when the proportion of supervised training is reduced, we introduce the Paired Transformed Autoencoders (PTAE) model, which uses Transformer-based crossmodal attention. PTAE achieves significantly higher accuracy in language-to-action and action-to-language translation, particularly in the realistic but difficult case when only a few supervised training samples are available. We also test whether the trained model behaves realistically when given conflicting multimodal input. In accordance with the concept of incongruence in psychology, conflict deteriorates the model's output. Conflicting action input has a more severe impact than conflicting language input, and more conflicting features lead to greater interference. PTAE can be trained mostly on unlabeled data when labeled data are scarce, and it behaves plausibly when tested with incongruent input.
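The abstract describes the architecture only at a high level. As a rough illustration of the core idea of pairing a language autoencoder with an action autoencoder through Transformer-style crossmodal attention, the sketch below uses PyTorch; all module names, dimensions, and the fusion scheme are illustrative assumptions and do not reproduce the authors' implementation (see the full text at the DOI below for details).

import torch
import torch.nn as nn

class CrossmodalFusion(nn.Module):
    """Fuse language and action features with Transformer-style crossmodal attention."""
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, lang_feats, act_feats):
        # Language tokens attend to action features (crossmodal attention).
        fused, _ = self.attn(query=lang_feats, key=act_feats, value=act_feats)
        return fused

class PairedAutoencoders(nn.Module):
    """Toy pairing of a language autoencoder and an action autoencoder."""
    def __init__(self, vocab=50, act_dim=10, dim=64):
        super().__init__()
        # Language autoencoder: embedding -> GRU encoder -> GRU decoder -> vocabulary logits.
        self.embed = nn.Embedding(vocab, dim)
        self.lang_enc = nn.GRU(dim, dim, batch_first=True)
        self.lang_dec = nn.GRU(dim, dim, batch_first=True)
        self.lang_out = nn.Linear(dim, vocab)
        # Action autoencoder: projected action features -> GRU encoder -> GRU decoder -> action reconstruction.
        self.act_in = nn.Linear(act_dim, dim)
        self.act_enc = nn.GRU(dim, dim, batch_first=True)
        self.act_dec = nn.GRU(dim, dim, batch_first=True)
        self.act_out = nn.Linear(dim, act_dim)
        self.fusion = CrossmodalFusion(dim)

    def forward(self, words, actions):
        lang_h, _ = self.lang_enc(self.embed(words))    # (batch, words, dim)
        act_h, _ = self.act_enc(self.act_in(actions))   # (batch, steps, dim)
        fused = self.fusion(lang_h, act_h)              # crossmodally fused language code
        words_rec = self.lang_out(self.lang_dec(fused)[0])
        acts_rec = self.act_out(self.act_dec(act_h)[0])
        return words_rec, acts_rec

model = PairedAutoencoders()
words = torch.randint(0, 50, (2, 6))    # toy word-ID sequences
actions = torch.randn(2, 8, 10)         # toy action trajectories
w_rec, a_rec = model(words, actions)
print(w_rec.shape, a_rec.shape)         # torch.Size([2, 6, 50]) torch.Size([2, 8, 10])

In a weakly supervised setup of the kind the abstract describes, the per-modality reconstruction losses could be applied to all samples, while a crossmodal translation objective would only be applied to the occasional paired (supervised) samples.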

Details

Language :
English
ISSN :
0883-9514 and 1087-6545
Volume :
37
Issue :
1
Database :
Directory of Open Access Journals
Journal :
Applied Artificial Intelligence
Publication Type :
Academic Journal
Accession number :
edsdoj.fe16be4a2f8345afa750d580440cc6cd
Document Type :
article
Full Text :
https://doi.org/10.1080/08839514.2023.2179167