Smiling in the Face and Voice of Avatars and Robots: Evidence for a ‘Smiling McGurk Effect’.

Authors :
Torre, Ilaria
Holk, Simon
Yadollahi, Elmira
Leite, Iolanda
McDonnell, Rachel
Harte, Naomi
Source :
IEEE Transactions on Affective Computing; Apr-Jun2024, Vol. 15 Issue 2, p393-404, 12p
Publication Year :
2024

Abstract

Multisensory integration influences emotional perception, as the McGurk effect demonstrates for communication between humans. Human physiology implicitly links the production of visual features with other modalities such as the audio channel: the facial muscles responsible for a smiling face also stretch the vocal cords, resulting in a characteristic 'smiling voice'. For artificial agents capable of multimodal expression, this linkage is modeled explicitly. In our studies, we observe the influence of the visual and audio channels on the perception of an agent's emotional expression. We created videos of virtual characters and social robots with either matching or mismatching emotional expressions in the audio and visual channels. In two online studies, we measured the agents' perceived valence and arousal. Our results consistently support the 'emotional McGurk effect' hypothesis, according to which the face transmits valence information and the voice transmits arousal. For dynamic virtual characters, visual information is sufficient to convey both valence and arousal, and thus audio expressivity need not be congruent. For robots with fixed facial expressions, however, both visual and audio information must be present to convey the intended expression. [ABSTRACT FROM AUTHOR]

Details

Language :
English
ISSN :
1949-3045
Volume :
15
Issue :
2
Database :
Complementary Index
Journal :
IEEE Transactions on Affective Computing
Publication Type :
Academic Journal
Accession number :
177606920
Full Text :
https://doi.org/10.1109/TAFFC.2022.3213269