Multisensory perception of vocal and facial expressions of emotions
- Publication Year :
- 2022
Abstract
- Emotions play a pivotal role in our lives, and we overwhelmingly express and perceive them through faces and voices. The present thesis investigates the perception and representation of emotion expressions in various contexts. In the first study, we examined how well neurotypical individuals discriminate dynamic facial and vocal emotions, with specific attention to the temporal component of dynamic expressions, showing that the information needed for a discrimination decision accumulates faster in vision than in audition, but fastest of all in a multisensory context. In the second study, we investigated the neural correlates of the perception of unimodal and multimodal expressions using functional MRI. We show that emotion information is represented in a widespread fashion throughout the regions of the face- and voice-processing networks. We additionally demonstrate that several of these regions represent not only their native modality but the opposite sensory modality as well, with some doing so in a supramodal fashion, i.e., independently of the sensory modality of the input. In the third study, we asked whether visual perception is necessary for the development of emotion discrimination through voices by testing early blind and sighted individuals. Although the behavioral profile was similar across the two groups for the investigated emotion categories, blindness affected performance for specific threat-related vocal emotions.
  (PSYE - Sciences psychologiques et de l'éducation) -- UCL, 2022
Details
- Database :
- OAIster
- Notes :
- English
- Publication Type :
- Electronic Resource
- Accession number :
- edsoai.on1372935358
- Document Type :
- Electronic Resource