
结合依存句法分析与交互注意力机制的隐式方面提取 (Implicit aspect extraction combining dependency syntactic parsing and an interactive attention mechanism).

Authors :
汪兰兰
姚春龙
李旭
于晓强
Source :
Application Research of Computers / Jisuanji Yingyong Yanjiu. Jan2022, Vol. 39 Issue 1, p37-42. 6p.
Publication Year :
2022

Abstract

Implicit aspect extraction is important for improving the accuracy of fine-grained sentiment analysis, but existing implicit aspect extraction techniques generalize poorly on large-scale data. To address this problem, this paper proposes an implicit aspect extraction model that combines dependency syntactic parsing with an interactive attention mechanism. The model first generates an initial representation of the text with the pre-trained language model BERT, then feeds this representation into a self-attention layer guided by dependency syntactic parsing. An interactive attention mechanism then fuses the outputs of these two stages, and a classifier finally determines the implicit aspect of the sentence. Compared with the BERT baseline and other deep neural network models, the proposed model achieves higher F1 and AUC on the enhanced SemEval implicit aspect dataset, demonstrating its effectiveness. [ABSTRACT FROM AUTHOR]
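The abstract outlines a four-stage pipeline: BERT encoding, dependency-guided self-attention, interactive attention fusion, and classification. The PyTorch sketch below is only a minimal illustration of that description; the class name ImplicitAspectExtractor, the layer sizes, the mean-pooling step, and the use of nn.MultiheadAttention for both attention stages are assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class ImplicitAspectExtractor(nn.Module):
    """Hypothetical sketch of the pipeline described in the abstract:
    BERT encoding -> dependency-guided self-attention ->
    interactive attention fusion -> implicit-aspect classifier."""

    def __init__(self, num_aspects, hidden=768, heads=8):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        # Self-attention whose mask restricts attention to token pairs
        # connected in the dependency parse (assumed design).
        self.dep_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # Interactive attention: the BERT representation attends to the
        # dependency-guided representation to fuse the two.
        self.inter_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.classifier = nn.Linear(hidden, num_aspects)

    def forward(self, input_ids, attention_mask, dep_mask):
        # 1) Initial representation from pre-trained BERT.
        h = self.bert(input_ids=input_ids,
                      attention_mask=attention_mask).last_hidden_state
        # 2) Dependency-guided self-attention: dep_mask[b, i, j] is True
        #    where token j is NOT a syntactic neighbour of token i,
        #    so such pairs are blocked from attending to each other.
        mask = dep_mask.repeat_interleave(self.dep_attn.num_heads, dim=0)
        dep_out, _ = self.dep_attn(h, h, h, attn_mask=mask)
        # 3) Interactive attention between the two representations.
        fused, _ = self.inter_attn(h, dep_out, dep_out)
        # 4) Sentence-level pooling and classification of the implicit aspect.
        pooled = fused.mean(dim=1)
        return self.classifier(pooled)
```

The dependency mask would typically be built offline with a parser (e.g. an adjacency matrix over the dependency tree, aligned to BERT's word pieces); how the paper constructs and applies it is not specified in the abstract.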

Details

Language :
Chinese
ISSN :
1001-3695
Volume :
39
Issue :
1
Database :
Academic Search Index
Journal :
Application Research of Computers / Jisuanji Yingyong Yanjiu
Publication Type :
Academic Journal
Accession number :
154623751
Full Text :
https://doi.org/10.19734/j.issn.1001-3695.2021.06.0249