
BrainNPT: Pre-training of Transformer networks for brain network classification

Authors:
Hu, Jinlong
Huang, Yangmin
Wang, Nan
Dong, Shoubin
Publication Year:
2023

Abstract

Deep learning methods have advanced rapidly in brain imaging analysis over the past few years, but they are usually restricted by limited labeled data. Pre-training on unlabeled data has yielded promising improvements in feature learning in many domains, including natural language processing and computer vision. However, this technique is under-explored in brain network analysis. In this paper, we focused on pre-training methods with Transformer networks to leverage existing unlabeled data for brain functional network classification. First, we proposed a Transformer-based neural network, named BrainNPT, for brain functional network classification. The proposed method leverages a <cls> token as a classification embedding vector for the Transformer model to effectively capture the representation of a brain network. Second, we proposed a pre-training framework for the BrainNPT model that leverages unlabeled brain network data to learn the structural information of brain networks. The classification experiments demonstrated that the BrainNPT model without pre-training achieved the best performance among the state-of-the-art models, and that the BrainNPT model with pre-training strongly outperformed the state-of-the-art models. Pre-training improved the accuracy of the BrainNPT model by 8.75% compared with the model without pre-training. We further compared the pre-training strategies, analyzed the influence of the model's parameters, and interpreted the trained model.

Comment: Prepared to Submit
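The abstract does not include implementation details, but the core idea it names, using a <cls> token's output embedding as the classification vector of a Transformer over a brain functional network, can be sketched briefly. The following is a minimal PyTorch sketch under stated assumptions, not the authors' implementation: the class name BrainNPTSketch, the region count, the model dimensions, and the row-wise projection of the connectivity matrix are all illustrative choices.

```python
import torch
import torch.nn as nn

class BrainNPTSketch(nn.Module):
    """Minimal sketch (not the authors' code): a Transformer encoder that
    prepends a learnable <cls> token to the sequence of brain-region tokens
    and classifies the whole network from that token's output embedding."""

    def __init__(self, n_regions=200, d_model=128, n_heads=4,
                 n_layers=4, n_classes=2):
        super().__init__()
        # Assumption: each row of the functional connectivity matrix is one
        # region's connectivity profile, projected to the model dimension.
        self.input_proj = nn.Linear(n_regions, d_model)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, conn):            # conn: (batch, regions, regions)
        x = self.input_proj(conn)       # (batch, regions, d_model)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        x = torch.cat([cls, x], dim=1)  # prepend the <cls> token
        x = self.encoder(x)
        return self.classifier(x[:, 0]) # classify from the <cls> embedding

# Usage: a batch of 8 connectivity matrices over 200 regions.
model = BrainNPTSketch()
logits = model(torch.randn(8, 200, 200))
print(logits.shape)  # torch.Size([8, 2])
```

In the pre-training stage described by the abstract, the same encoder would be trained on unlabeled brain networks with a self-supervised objective before the classifier head is fine-tuned on labeled data; the abstract does not specify the pretext task, so none is shown here.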

Details

Database:
OAIster
Publication Type:
Electronic Resource
Accession number:
edsoai.on1381622663
Document Type:
Electronic Resource