Gaze-Based Intention Estimation for Shared Autonomy in Pick-and-Place Tasks.
- Source :
- Frontiers in neurorobotics [Front Neurorobot] 2021 Apr 16; Vol. 15, pp. 647930. Date of Electronic Publication: 2021 Apr 16 (Print Publication: 2021).
- Publication Year :
- 2021
Abstract
- Shared autonomy aims at combining robotic and human control in the execution of remote, teleoperated tasks. This cooperative interaction cannot be brought about without the robot first recognizing the current human intention in a fast and reliable way, so that a suitable assisting plan can be quickly instantiated and executed. Eye movements have long been known to be highly predictive of the cognitive agenda unfolding during manual tasks and hence constitute the earliest and most reliable behavioral cues for intention estimation. In this study, we present an experiment aimed at analyzing human behavior in simple teleoperated pick-and-place tasks in a simulated scenario and at devising a suitable model for early estimation of the current proximal intention. We show that scan paths are, as expected, heavily shaped by the current intention and that two types of Gaussian Hidden Markov Models, one more scene-specific and one more action-specific, achieve a very good prediction performance while also generalizing to new users and spatial arrangements. We finally discuss how the behavioral and model results suggest that eye movements reflect, to some extent, the invariance and generality of higher-level planning across object configurations, which can be leveraged by cooperative robotic systems.
- Competing Interests: When the study was conducted, both authors were employed by the Honda Research Institute Europe GmbH. During the writing of the manuscript, SF moved to Siemens AG.
- (Copyright © 2021 Fuchs and Belardinelli.)
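
The paper's actual models and gaze features are not reproduced in this record; as a minimal, hypothetical sketch of the general technique the abstract names, the code below classifies a one-dimensional gaze-feature sequence by comparing forward-algorithm log-likelihoods across per-intention Gaussian HMMs (all model names and parameters here are illustrative assumptions, not the authors' values).

```python
import math


def forward_loglik(obs, pi, A, means, variances):
    """Log-likelihood of a 1-D observation sequence under a Gaussian HMM,
    computed with the forward algorithm in log space for stability."""
    n = len(pi)

    def log_gauss(x, mu, var):
        # Log density of a univariate Gaussian emission.
        return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

    def logsumexp(terms):
        m = max(terms)
        return m + math.log(sum(math.exp(v - m) for v in terms))

    # Initialization: prior times emission probability of the first observation.
    log_alpha = [math.log(pi[i]) + log_gauss(obs[0], means[i], variances[i])
                 for i in range(n)]

    # Recursion: sum over predecessor states, then emit the next observation.
    for t in range(1, len(obs)):
        log_alpha = [
            logsumexp([log_alpha[i] + math.log(A[i][j]) for i in range(n)])
            + log_gauss(obs[t], means[j], variances[j])
            for j in range(n)
        ]

    # Termination: total likelihood is the sum over final states.
    return logsumexp(log_alpha)


def classify(obs, models):
    """Return the intention whose HMM assigns the sequence the highest
    likelihood. `models` maps intention name -> (pi, A, means, variances)."""
    return max(models, key=lambda name: forward_loglik(obs, *models[name]))
```

A toy usage: two hypothetical intentions whose HMMs differ only in emission means, so a low-valued gaze sequence is attributed to "pick" and a high-valued one to "place".

```python
models = {
    "pick":  ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]], [0.0, 1.0], [0.5, 0.5]),
    "place": ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]], [5.0, 6.0], [0.5, 0.5]),
}
print(classify([0.1, 0.2, 0.9, 1.1], models))  # -> pick
print(classify([5.2, 4.8, 5.9, 6.1], models))  # -> place
```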
Details
- Language :
- English
- ISSN :
- 1662-5218
- Volume :
- 15
- Database :
- MEDLINE
- Journal :
- Frontiers in neurorobotics
- Publication Type :
- Academic Journal
- Accession number :
- 33935675
- Full Text :
- https://doi.org/10.3389/fnbot.2021.647930