Gaze detection as a social cue to initiate natural human-robot collaboration in an assembly task
- Authors
Matteo Lavit Nicora, Pooja Prajod, Marta Mondellini, Giovanni Tauro, Rocco Vertechy, Elisabeth André, and Matteo Malosio
- Subjects
human-robot interaction, industry 5.0, gaze estimation, natural behavior, human-centered computing, Mechanical engineering and machinery, TJ1-1570, Electronic computers. Computer science, QA75.5-76.95
- Abstract
Introduction: In this work we explore a potential approach to improve the human-robot collaboration experience by adapting cobot behavior based on natural cues from the operator. Methods: Inspired by the literature on human-human interactions, we conducted a Wizard-of-Oz study to examine whether a gaze towards the cobot can serve as a trigger for initiating joint activities in collaborative sessions. In this study, 37 participants engaged in an assembly task while their gaze behavior was analyzed. We employed a gaze-based attention recognition model to identify when the participants looked at the cobot. Results: Our results indicate that in most cases (83.74%), the joint activity is preceded by a gaze towards the cobot. Furthermore, during the entire assembly cycle, the participants tend to look at the cobot mostly around the time of the joint activity. Given these results, a fully integrated system triggering joint action only when the gaze is directed towards the cobot was piloted with 10 volunteers, one of whom was characterized by high-functioning Autism Spectrum Disorder. Even though they had never interacted with the robot and did not know about the gaze-based triggering system, most of them successfully collaborated with the cobot and reported a smooth and natural interaction experience. Discussion: To the best of our knowledge, this is the first study to analyze the natural gaze behavior of participants working on a joint activity with a robot during a collaborative assembly task and to attempt the full integration of an automated gaze-based triggering system.
- Published
2024