1. Individual cognitive traits can be predicted from task-based dynamic functional connectivity with a deep convolutional-recurrent model.
- Author
- Almeida de Souza E, Vieira BH, and Salmon CEG
- Subjects
- Humans, Male, Female, Adult, Young Adult, Language, Magnetic Resonance Imaging methods, Connectome methods, Cognition physiology, Brain physiology, Brain diagnostic imaging, Intelligence physiology, Memory, Short-Term physiology, Deep Learning
- Abstract
There has been increasing interest in understanding the neural substrates of intelligence and other human traits from neuroimaging data. Deep learning can be used to predict cognitive measures, such as general and fluid intelligence, from different functional magnetic resonance imaging experiments, providing information about the main brain areas involved in these predictions. Using neuroimaging and behavioral data from 874 subjects provided by the Human Connectome Project, we predicted various cognitive scores from dynamic functional connectivity derived from the language and working-memory functional magnetic resonance imaging task states, using a 360-region multimodal atlas. The deep model combines multiscale convolutional and long short-term memory layers and was trained under 10-fold stratified cross-validation. We removed the confounding effects of gender, age, total brain volume, motion, and the multiband reconstruction algorithm using multiple linear regression. The model explained 17.1% and 16% of the variance in general intelligence for the working-memory and language tasks, respectively. We showed that task-based dynamic functional connectivity has more predictive power than resting-state dynamic functional connectivity reported in the literature, and that removing confounders significantly reduces prediction performance. No specific cortical network showed significant relevance in the prediction of general and fluid intelligence, suggesting a spatially homogeneous distribution of the intelligence construct in the brain. (© The Author(s) 2024. Published by Oxford University Press. All rights reserved.)
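The confound-removal step described in the abstract, regressing the target scores on the confounds (gender, age, total brain volume, motion, reconstruction algorithm) and keeping the residuals, can be sketched as below. This is a generic residualization sketch, not the authors' code; the data, variable names, and the use of scikit-learn are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Illustrative data: 874 subjects with 5 confound columns
# (e.g. sex, age, total brain volume, motion, reconstruction flag)
# and one cognitive score partly driven by those confounds.
n_subjects = 874
confounds = rng.normal(size=(n_subjects, 5))
scores = confounds @ rng.normal(size=5) + rng.normal(size=n_subjects)

# Multiple linear regression of the score on the confounds;
# the residuals serve as the deconfounded prediction target.
deconf_model = LinearRegression().fit(confounds, scores)
residual_scores = scores - deconf_model.predict(confounds)

# Sanity check: residuals are uncorrelated with every confound column.
corr = np.corrcoef(confounds.T, residual_scores)[-1, :-1]
```

In practice this residualization would be fit on training folds only and applied to held-out folds, so that confound information does not leak across the cross-validation split.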
- Published
- 2024