
The Neural Representation of Visually Evoked Emotion Is High-Dimensional, Categorical, and Distributed across Transmodal Brain Regions

Authors :
Horikawa, Tomoyasu
Cowen, Alan S.
Keltner, Dacher
Kamitani, Yukiyasu
Source :
iScience, Vol 23, Iss 5 (2020)
Publication Year :
2020
Publisher :
Elsevier BV, 2020.

Abstract

Central to our subjective lives is the experience of different emotions. Recent behavioral work mapping emotional responses to 2,185 videos found that people experience upward of 27 distinct emotions occupying a high-dimensional space, and that emotion categories, more so than affective dimensions (e.g., valence), organize self-reports of subjective experience. Here, we sought to identify the neural substrates of this high-dimensional space of emotional experience using fMRI responses to all 2,185 videos. Our analyses demonstrated that (1) dozens of video-evoked emotions were accurately predicted from fMRI patterns in multiple brain regions with different regional configurations for individual emotions; (2) emotion categories better predicted cortical and subcortical responses than affective dimensions, outperforming visual and semantic covariates in transmodal regions; and (3) emotion-related fMRI responses had a cluster-like organization efficiently characterized by distinct categories. These results support an emerging theory of the high-dimensional emotion space, illuminating its neural foundations distributed across transmodal regions.

Highlights :
• Dozens of video-evoked emotions were predicted from fMRI patterns in multiple regions
• Categories better predicted cortical and subcortical responses than dimensions
• Emotion-related responses had a cluster-like organization characterized by categories
• Neural representation of emotion is high-dimensional, categorical, and distributed

Subject Areas :
Neuroscience; Cognitive Neuroscience; Techniques in Neuroscience

Details

ISSN :
25890042
Volume :
23
Database :
OpenAIRE
Journal :
iScience
Accession number :
edsair.doi.dedup.....fbc065cebc719669c3a89eba48639cbd