1. Aspect-Level Implicit Sentiment Analysis Model Based on BERT and an Attention Mechanism (基于 BERT 与注意力机制的方面级隐式情感分析模型)
- Author
- 杨春霞, 韩煜, 陈启岗, and 马文文
- Abstract
- Aspect-level sentiment texts contain many comment sentences without explicit sentiment words; the study of their sentiment is called aspect-level implicit sentiment analysis. Existing models have two problems: context information related to aspect words may be lost during pre-training, and deep features in the context cannot be extracted accurately. To address the first problem, this paper constructs an aspect-aware BERT pre-training model that introduces aspect words into the input embedding structure of basic BERT, generating word vectors related to the aspect words. To address the second problem, this paper constructs a context-aware attention mechanism: for the deep hidden vectors obtained from the encoding layer, semantic and syntactic information is introduced into the attention-weight calculation, so that the attention mechanism can more accurately assign weights to the context related to the aspect words. Comparative experiments show that the proposed model outperforms the baseline models. [ABSTRACT FROM AUTHOR]
- Published
- 2023
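The aspect-conditioned attention described in the abstract can be illustrated with a minimal sketch. All names, shapes, and the bilinear scoring form are assumptions for illustration, not the authors' actual model, and the syntactic-information term the paper adds to the weight calculation is omitted here:

```python
import numpy as np

def aspect_attention(hidden, aspect, w):
    """Hypothetical sketch: score each context hidden state against a
    pooled aspect-word vector, then softmax the scores into weights."""
    # hidden: (seq_len, d) deep hidden vectors from the encoding layer
    # aspect: (d,) pooled aspect-word representation
    # w:      (d, d) stand-in for a learned bilinear weight matrix
    scores = hidden @ w @ aspect            # (seq_len,) relevance scores
    scores = scores - scores.max()          # shift for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()
    summary = weights @ hidden              # aspect-aware context summary
    return weights, summary

rng = np.random.default_rng(0)
hidden = rng.normal(size=(5, 8))            # 5 context tokens, d = 8
aspect = rng.normal(size=8)
w = np.eye(8)                               # identity in place of learned weights
weights, summary = aspect_attention(hidden, aspect, w)
```

Context tokens more aligned with the aspect vector receive larger weights, which is the intuition behind letting the attention mechanism focus on aspect-related context.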