A topography-based predictive framework for naturalistic viewing fMRI.
- Authors
- Li X, Friedrich P, Patil KR, Eickhoff SB, and Weis S
- Subjects
- Emotions physiology, Brain Mapping methods, Cognition, Magnetic Resonance Imaging methods, Brain diagnostic imaging, Brain physiology
- Abstract
Functional magnetic resonance imaging (fMRI) during naturalistic viewing (NV) provides exciting opportunities for studying brain function in more ecologically valid settings. Understanding individual differences in brain function during NV and their behavioural relevance has recently become an important goal. However, methods specifically designed for this purpose remain limited. Here, we propose a topography-based predictive framework (TOPF) to fill this methodological gap. TOPF identifies individual-specific evoked activity topographies in a data-driven manner and examines their behavioural relevance using a machine learning-based predictive framework. We validate TOPF on both NV and task-based fMRI data from multiple conditions. Our results show that TOPF effectively and stably captures individual differences in evoked brain activity and successfully predicts phenotypes across cognition, emotion and personality in unseen subjects from their activity topographies. Moreover, TOPF compares favourably with functional connectivity-based approaches in prediction performance, and the identified predictive brain regions are neurobiologically interpretable. Crucially, we highlight the importance of examining individual evoked brain activity topographies for advancing our understanding of the brain-behaviour relationship. We believe that TOPF provides a simple but powerful tool for understanding brain-behaviour relationships at the individual level, with strong potential for clinical applications.
- Competing Interests
- The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
- Copyright
- © 2023. Published by Elsevier Inc.
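The two-step pipeline the abstract describes can be sketched roughly as follows. This is a toy illustration on simulated data with assumed design choices (PCA for the data-driven topography extraction, ridge regression for phenotype prediction, and cross-validation to evaluate on held-out subjects); it is not the authors' exact implementation, and for simplicity the topography features here are extracted once on all subjects rather than within training folds, as a rigorous analysis would require.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_subjects, n_timepoints, n_regions = 40, 120, 30

# Simulated NV-like data: each subject's regional time series is a shared
# stimulus-evoked response scaled by an individual topography, plus noise.
shared_response = rng.standard_normal(n_timepoints)
topographies = rng.standard_normal((n_subjects, n_regions))
data = np.stack([
    np.outer(shared_response, topographies[s])
    + 0.5 * rng.standard_normal((n_timepoints, n_regions))
    for s in range(n_subjects)
])  # shape: (subjects, timepoints, regions)

# Hypothetical phenotype driven by the topography value in one region.
phenotype = topographies[:, 0] + 0.2 * rng.standard_normal(n_subjects)

# Step 1 (assumed): per region, run PCA across subjects' time series;
# each subject's loading on the first component is their topography feature.
features = np.zeros((n_subjects, n_regions))
for r in range(n_regions):
    ts = data[:, :, r].T                 # timepoints x subjects
    pca = PCA(n_components=1).fit(ts)
    features[:, r] = pca.components_[0]  # one loading per subject

# Step 2 (assumed): predict the phenotype for held-out subjects with
# ridge regression under 5-fold cross-validation.
predicted = cross_val_predict(Ridge(alpha=1.0), features, phenotype, cv=5)
accuracy = np.corrcoef(phenotype, predicted)[0, 1]
```

Because prediction is linear, the arbitrary sign of each region's first principal component is absorbed by the learned weights and does not affect the cross-validated correlation `accuracy`.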
- Published
- 2023