
A Hierarchical Graph Attention Network Text Classification Model Integrating Label Information (融合标签信息的分层图注意力网络文本分类模型).

Authors :
杨春霞
马文文
徐奔
韩煜
Source :
Computer Engineering & Science / Jisuanji Gongcheng yu Kexue. Nov2023, Vol. 45 Issue 11, p2018-2026. 9p.
Publication Year :
2023

Abstract

Single-label text classification based on hierarchical graph attention networks currently has two main limitations: first, such models cannot effectively extract text features; second, few studies highlight text features through the connection between text and labels. To address these two issues, a hierarchical graph attention network text classification model that integrates label information is proposed. The model constructs an adjacency matrix from the relevance between sentence keywords and topics, then applies a word-level graph attention network to obtain sentence vector representations. Based on randomly initialized target vectors, it uses max pooling to extract sentence-specific target vectors, so that the resulting sentence vectors carry more distinct category features. After the word-level graph attention layer, a sentence-level graph attention network produces a new text representation that incorporates word-weight information, and a pooling layer extracts the text's feature information. In parallel, GloVe pre-trained word vectors are used to initialize the vector representations of all text labels, which are then interacted and fused with the text's feature information to reduce the loss of the original features, yielding feature representations that are distinguishable across different texts. Experimental results on five public datasets (R52, R8, 20NG, Ohsumed, and MR) show that the model's classification accuracy significantly exceeds that of other mainstream baseline models. [ABSTRACT FROM AUTHOR]
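The abstract gives no implementation details beyond the architecture description, so the sketch below is an assumed, minimal reconstruction of the two core operations it mentions: a single-head graph attention layer (in the standard GAT formulation) that could serve as the word-level or sentence-level layer over the keyword-topic adjacency matrix, and a residual attention fusion of the pooled text feature with GloVe-initialized label embeddings. All function names, the LeakyReLU slope, and the exact fusion form are illustrative assumptions, not the authors' code.

```python
import numpy as np

def masked_softmax(x, mask):
    # Softmax over the last axis, restricted to positions where mask is True
    # (i.e., normalize attention only over a node's graph neighbors).
    x = np.where(mask, x, -1e9)
    e = np.exp(x - x.max(axis=-1, keepdims=True)) * mask
    return e / e.sum(axis=-1, keepdims=True)

def gat_layer(H, A, W, a):
    """Single-head graph attention layer (standard GAT form, assumed here).
    H: (n, d_in) node features; A: (n, n) 0/1 adjacency with self-loops;
    W: (d_in, d_out) projection; a: (2*d_out,) attention parameter vector."""
    Z = H @ W                                   # project node features
    d = Z.shape[1]
    # attention logit e_ij = LeakyReLU(a^T [z_i || z_j]), computed pairwise
    e = (Z @ a[:d])[:, None] + (Z @ a[d:])[None, :]
    e = np.where(e > 0, e, 0.2 * e)             # LeakyReLU, slope assumed 0.2
    alpha = masked_softmax(e, A.astype(bool))   # normalize over neighbors only
    return alpha @ Z                            # aggregate neighbor features

def fuse_labels(text_feat, label_embs):
    """Assumed residual fusion: attend from the text feature to the label
    embeddings, then add the result back so original features are retained.
    text_feat: (d,) pooled text vector; label_embs: (c, d) label vectors."""
    scores = label_embs @ text_feat             # similarity to each label
    w = np.exp(scores - scores.max())
    w /= w.sum()                                # attention weights over labels
    return text_feat + w @ label_embs           # residual connection
```

A word-level pass followed by max pooling would then produce a sentence vector (`gat_layer(...).max(axis=0)`), mirroring the abstract's "maximum pooling to extract specific target vectors for sentences".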

Details

Language :
Chinese
ISSN :
1007-130X
Volume :
45
Issue :
11
Database :
Academic Search Index
Journal :
Computer Engineering & Science / Jisuanji Gongcheng yu Kexue
Publication Type :
Academic Journal
Accession number :
173680331
Full Text :
https://doi.org/10.3969/j.issn.1007-130X.2023.11.013