Affective prosody guides facial emotion processing.

Authors :
Cui, Xin
Jiang, Xiaoming
Ding, Hongwei
Source :
Current Psychology; Sep 2023, Vol. 42 Issue 27, p23891-23902, 12p
Publication Year :
2023

Abstract

Previous studies have reported an "emotional congruency effect (ECE)" in cross-modal emotion processing, claiming that congruent multimodal emotional signals enhance emotion processing; yet few studies have shown how this effect unfolds dynamically over time or whether it operates in the same way across language and cultural backgrounds. We adopted the eye-tracking technique to investigate whether and how auditory emotional signals influence the visual processing of emotional faces according to the ECE. We explored this issue by asking thirty-two native Mandarin speakers to scan a visual array of four types of emotional faces while listening to affective prosody matching one of the four emotions. To eliminate potential confounding from lexico-semantic information, the affective prosody was pronounced in meaningless di-syllable clusters. Results of the experiment indicate that (1) participants paid more attention to happy faces at first glance, and their attention shifted to angry and sad faces over time; (2) consistent with findings in English-speaking settings, the ECE appeared in Mandarin-speaking settings, but it took effect earlier for happy faces and persisted across all emotions as the signal unfolded. Based on these results, we conclude that processing time differs across emotion types, and the ECE therefore takes effect at different time points depending on the emotion type. Finally, we suggest that language and cultural experience may shape the processing time of different emotions. [ABSTRACT FROM AUTHOR]

Details

Language :
English
ISSN :
1046-1310
Volume :
42
Issue :
27
Database :
Complementary Index
Journal :
Current Psychology
Publication Type :
Academic Journal
Accession number :
171915636
Full Text :
https://doi.org/10.1007/s12144-022-03528-7