
Face Recognition, Musical Appraisal, and Emotional Crossmodal Bias

Authors :
Sara Invitto
Antonio Calcagnì
Arianna Mignozzi
Rosanna Scardino
Giulia Piraino
Daniele Turchi
Irio De Feudis
Antonio Brunetti
Vitoantonio Bevilacqua
Marina de Tommaso
Source :
Frontiers in Behavioral Neuroscience, Vol 11 (2017)
Publication Year :
2017
Publisher :
Frontiers Media S.A., 2017.

Abstract

Recent research on the crossmodal integration of visual and auditory perception suggests that evaluations of emotional information in one sensory modality tend toward the emotional value generated in another sensory modality. This implies that the emotions elicited by musical stimuli can influence, through a top-down process, the perception of emotional stimuli presented in other sensory modalities. The aim of this work was to investigate how crossmodal perceptual processing influences emotional face recognition, and how any modulation of this processing induced by listening to music varies with the subjective emotional salience of the music and the listener's musical competence. The sample consisted of 24 participants: 12 professional musicians and 12 university students (non-musicians). Participants performed an emotional go/no-go task whilst listening to music by Albeniz, Chopin, or Mozart. The target stimuli were emotionally neutral facial expressions. We examined the N170 event-related potential (ERP) and behavioral responses (i.e., motor reaction time to target recognition and musical emotional judgment). A linear mixed-effects model and a decision-tree learning technique were applied to N170 amplitudes and latencies. The main findings were that musicians' behavioral responses and N170 components were more affected by the emotional value of the music administered during the emotional go/no-go task, and that this bias was also apparent in responses to the non-target emotional faces. This suggests that emotional information arriving through multiple sensory channels activates a crossmodal integration process that depends upon the stimuli's emotional salience and the listener's appraisal.
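The analysis described above (a linear mixed-effects model applied to N170 amplitudes, with repeated measures per participant) can be sketched as follows. This is a minimal illustration on simulated data, not the authors' actual analysis: the variable names (`group`, `valence`, `amplitude`), the effect sizes, and the random-intercept-only structure are all assumptions chosen for the sketch.

```python
# Hedged sketch: random-intercept mixed-effects model on simulated N170
# amplitude data, using statsmodels. All data and effect sizes are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subjects, n_trials = 24, 30  # mirrors the sample size in the abstract

rows = []
for subj in range(n_subjects):
    group = "musician" if subj < 12 else "non_musician"
    subj_offset = rng.normal(0, 1)  # per-participant random intercept
    for _ in range(n_trials):
        valence = rng.choice(["positive", "negative", "neutral"])
        # Simulated N170 amplitude in microvolts (a negative-going component);
        # a hypothetical extra shift for musicians on emotional music.
        amp = (-5.0 + subj_offset
               + (0.8 if group == "musician" and valence != "neutral" else 0.0)
               + rng.normal(0, 1))
        rows.append(dict(subject=subj, group=group,
                         valence=valence, amplitude=amp))

df = pd.DataFrame(rows)

# Fixed effects: group, music valence, and their interaction;
# random effect: intercept grouped by participant.
model = smf.mixedlm("amplitude ~ group * valence", df, groups=df["subject"])
result = model.fit()
print(result.summary())
```

Grouping by participant lets the model separate stable between-subject amplitude differences from the within-subject effects of group and musical valence, which is the usual motivation for a mixed model over a plain OLS fit on trial-level ERP data.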

Details

Language :
English
ISSN :
1662-5153
Volume :
11
Database :
Directory of Open Access Journals
Journal :
Frontiers in Behavioral Neuroscience
Publication Type :
Academic Journal
Accession number :
edsdoj.4274e6684743427ba7c4927c7c63ee52
Document Type :
article
Full Text :
https://doi.org/10.3389/fnbeh.2017.00144