
Attention-Based Transformer-BiGRU for Question Classification.

Authors :
Han, Dongfang
Tohti, Turdi
Hamdulla, Askar
Source :
Information (2078-2489); May2022, Vol. 13 Issue 5, p214-214, 22p
Publication Year :
2022

Abstract

A question answering (QA) system is a research direction in artificial intelligence and natural language processing (NLP) that has attracted much attention and has broad development prospects. As one of the main components of a QA system, question classification plays a key role in the accuracy of the entire QA task. Therefore, not only traditional machine learning methods but also today's deep learning methods are widely used and deeply studied in question classification tasks. This paper presents our work on two aspects of Chinese question classification. The first is an answer-driven method for building a richer Chinese question classification dataset, addressing the small scale of existing experimental datasets; this has reference value for dataset expansion, especially for the construction of low-resource language datasets. The second is a deep learning model for question classification with a Transformer + Bi-GRU + Attention structure. The Transformer has strong learning and encoding ability, but it adopts a fixed encoding length, dividing long text into multiple segments that are each encoded separately, so no interaction occurs between segments. Here, we achieve information interaction between segments through a Bi-GRU, improving the encoding of long sentences. The Attention mechanism is added to highlight the key semantics in questions that contain answers. The experimental results show that the proposed model significantly improves the accuracy of question classification. [ABSTRACT FROM AUTHOR]
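The attention component described above assigns a weight to each encoder hidden state and pools them into a single question vector, highlighting the key semantics. A minimal NumPy sketch of this kind of attention pooling is below; the function name, the scoring vector `w`, and the dimensions are illustrative assumptions, not the authors' actual implementation (which sits on top of Transformer + Bi-GRU outputs).

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    """Attention pooling over encoder hidden states.

    H : (T, d) hidden states per token (e.g. Bi-GRU outputs) -- hypothetical
    w : (d,)   learned scoring vector -- hypothetical parameter
    Returns the attention-weighted question vector (d,) and the weights (T,).
    """
    scores = H @ w           # one relevance score per time step
    alpha = softmax(scores)  # normalized attention weights, sum to 1
    return alpha @ H, alpha

# Toy usage: 6 tokens, 8-dimensional hidden states
rng = np.random.default_rng(0)
H = rng.standard_normal((6, 8))
w = rng.standard_normal(8)
vec, alpha = attention_pool(H, w)
```

Tokens whose hidden states align with `w` receive larger weights, so answer-bearing words dominate the pooled vector fed to the classifier.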

Details

Language :
English
ISSN :
20782489
Volume :
13
Issue :
5
Database :
Complementary Index
Journal :
Information (2078-2489)
Publication Type :
Academic Journal
Accession number :
157238914
Full Text :
https://doi.org/10.3390/info13050214