
An adaptive multi-graph neural network with multimodal feature fusion learning for MDD detection.

Authors :
Xing, Tao
Dou, Yutao
Chen, Xianliang
Zhou, Jiansong
Xie, Xiaolan
Peng, Shaoliang
Source :
Scientific Reports. 11/18/2024, Vol. 14 Issue 1, p1-14. 14p.
Publication Year :
2024

Abstract

Major Depressive Disorder (MDD) is an affective disorder that can lead to persistent sadness and a decline in quality of life, increasing the risk of suicide. Utilizing multimodal data such as electroencephalograms and patient interview audio recordings can facilitate the timely detection of MDD. However, existing depression detection methods either consider only a single modality or do not fully account for the differences and similarities between modalities in multimodal approaches, potentially overlooking the latent information inherent in each modality's data. To address these challenges, we propose EMO-GCN, a multimodal depression detection method based on an adaptive multi-graph neural network. By employing graph-based methods to model data from each modality and extract features from them, the potential correlations between modalities are uncovered. The model's performance on the MODMA dataset is outstanding, achieving an accuracy (ACC) of 96.30%. Ablation studies further confirm the effectiveness of the model's individual components. The experimental results of EMO-GCN demonstrate the application prospects of graph-based multimodal analysis in the field of mental health, offering new perspectives for future research. [ABSTRACT FROM AUTHOR]
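For readers unfamiliar with graph-based multimodal fusion, the sketch below illustrates the general idea the abstract describes: encode each modality (e.g., EEG and interview audio) with its own graph convolution, then fuse the resulting features for classification. This is a minimal, hypothetical PyTorch example, not the authors' EMO-GCN implementation; all layer sizes, the mean-pooling step, the placeholder adjacency matrices, and the simple concatenation-based fusion are assumptions made for illustration only.

```python
# Illustrative sketch (NOT the authors' EMO-GCN code): a minimal two-modality
# GCN fusion classifier in PyTorch. Shapes, layer sizes, and the fusion scheme
# (simple concatenation) are assumptions for demonstration purposes.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(A_hat @ H @ W)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h, a_hat):
        # a_hat: normalized adjacency with self-loops, shape (N, N)
        return F.relu(a_hat @ self.linear(h))


class TwoModalityGCN(nn.Module):
    """Per-modality GCN encoders followed by feature concatenation."""

    def __init__(self, eeg_dim, audio_dim, hidden=64, n_classes=2):
        super().__init__()
        self.eeg_gcn = SimpleGCNLayer(eeg_dim, hidden)
        self.audio_gcn = SimpleGCNLayer(audio_dim, hidden)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, eeg_x, eeg_a, audio_x, audio_a):
        # Encode each modality on its own graph, pool over nodes, then fuse.
        eeg_h = self.eeg_gcn(eeg_x, eeg_a).mean(dim=0)
        audio_h = self.audio_gcn(audio_x, audio_a).mean(dim=0)
        fused = torch.cat([eeg_h, audio_h], dim=-1)
        return self.classifier(fused)


# Toy usage: 16 EEG-channel nodes and 10 audio-segment nodes (made-up sizes).
eeg_x, audio_x = torch.randn(16, 32), torch.randn(10, 20)
eeg_a, audio_a = torch.eye(16), torch.eye(10)  # placeholder adjacencies
logits = TwoModalityGCN(32, 20)(eeg_x, eeg_a, audio_x, audio_a)
```

In the paper's actual method, the graph construction is adaptive and the cross-modal fusion is learned; the fixed identity adjacencies and concatenation above stand in only to show where those components would plug in.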

Details

Language :
English
ISSN :
2045-2322
Volume :
14
Issue :
1
Database :
Academic Search Index
Journal :
Scientific Reports
Publication Type :
Academic Journal
Accession number :
180936060
Full Text :
https://doi.org/10.1038/s41598-024-79981-0