Multi-modal sentiment classification based on graph neural network and multi-head cross-attention mechanism for education emotion analysis
- Authors
- Zhiguang Liu, Guoyin Hao, Fengshuai Li, Xiaoqing He, and Yuanheng Zhang
- Subjects
- multi-modal sentiment classification, graph neural network, multi-head cross-attention mechanism, education emotion analysis, Engineering (General). Civil engineering (General), TA1-2040, Chemical engineering, TP155-156, Physics, QC1-999
- Abstract
One of the key tasks in multi-modal sentiment classification is to accurately extract and fuse complementary information from the text and visual modalities in order to detect the sentiment polarity of the aspect words mentioned in the text. Most existing methods combine only a single source of context information with the image, so they are insensitive to the correlations among aspects, context information, and visual information, and they extract aspect-related local visual information insufficiently. In addition, during feature fusion some modal information is under-represented, which limits the quality of the fused features. To enable fine-grained information interaction across modalities, this paper proposes a multi-modal sentiment classification framework based on a graph neural network and a multi-head cross-attention mechanism for education emotion analysis. First, cross-attention is used to obtain aspect-oriented global representations of the text and the image. Then a multi-modal interaction graph is built to connect the local and global representation nodes of the different modalities. Finally, a graph attention network fully integrates the features at both granularities. Extensive experiments on popular multi-modal sentiment analysis datasets demonstrate the advantages of the proposed framework over state-of-the-art methods.
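The two-stage pipeline the abstract describes (multi-head cross-attention between modalities, then attention over a multi-modal interaction graph) can be illustrated with a minimal NumPy sketch. The function names, feature dimensions, random projection weights, and fully connected interaction graph below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_cross_attention(query, context, num_heads, rng):
    """Tokens of one modality attend over features of the other.

    query:   (Lq, d), e.g. text token features
    context: (Lc, d), e.g. image region features
    Returns (Lq, d) context-aware representations.
    """
    Lq, d = query.shape
    assert d % num_heads == 0
    dh = d // num_heads
    # Random projections stand in for learned Q/K/V parameters.
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q = (query @ Wq).reshape(Lq, num_heads, dh)
    K = (context @ Wk).reshape(-1, num_heads, dh)
    V = (context @ Wv).reshape(-1, num_heads, dh)
    # Per-head scaled dot-product attention: scores shape (heads, Lq, Lc).
    scores = np.einsum('qhd,chd->hqc', Q, K) / np.sqrt(dh)
    attn = softmax(scores, axis=-1)
    out = np.einsum('hqc,chd->qhd', attn, V)
    return out.reshape(Lq, d)

def graph_attention_fuse(nodes, adj, rng):
    """One graph-attention layer over a multi-modal interaction graph.

    nodes: (N, d) node features (local and global nodes of both modalities)
    adj:   (N, N) 0/1 adjacency; attention is masked where there is no edge.
    """
    N, d = nodes.shape
    W = rng.standard_normal((d, d)) / np.sqrt(d)
    h = nodes @ W
    scores = (h @ h.T) / np.sqrt(d)
    scores = np.where(adj > 0, scores, -1e9)  # restrict attention to edges
    attn = softmax(scores, axis=-1)
    return attn @ h

# Toy example: 5 text tokens attend over 4 image regions.
rng = np.random.default_rng(0)
text = rng.standard_normal((5, 8))
image = rng.standard_normal((4, 8))
text_aware = multi_head_cross_attention(text, image, num_heads=2, rng=rng)

# Interaction graph over all nodes; here fully connected for simplicity.
nodes = np.vstack([text_aware, image])   # (9, 8)
adj = np.ones((9, 9))
fused = graph_attention_fuse(nodes, adj, rng)
```

In the paper's framework the adjacency would encode which local and global nodes of each modality are linked, rather than the fully connected toy graph used here.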
- Published
- 2024