
Attention-based RNN with question-aware loss and multi-level copying mechanism for natural answer generation

Authors :
Fen Zhao
Huishuang Shao
Shuo Li
Yintong Wang
Yan Yu
Source :
Complex & Intelligent Systems, Vol 10, Iss 5, Pp 7249-7264 (2024)
Publication Year :
2024
Publisher :
Springer, 2024.

Abstract

Natural answer generation has clear practical significance and a strong application background; it can be widely used in knowledge services such as community question answering and intelligent customer service. Traditional knowledge question answering provides only precise answer entities and neglects the fact that users hope to receive a complete natural answer. In this research, we propose a novel attention-based recurrent neural network for natural answer generation, enhanced with multi-level copying mechanisms and a question-aware loss. To generate natural answers that conform to grammar, we combine multi-level copying mechanisms, which copy semantic units from the source, with a prediction mechanism that generates common words. Moreover, to address the problem that a generated natural answer may not match the user's question, a question-aware loss is introduced so that the generated target answer sequence corresponds to the question. Experiments on three response generation tasks show our model to be superior in quality while being more parallelizable and requiring significantly less training time. Our model achieves 0.727 BLEU on the SimpleQuestions response generation task, improving over the best existing result by over 0.007 BLEU, and improves naturalness by up to 0.05 over the best-performing baseline. The results show that our method can generate grammatical and contextual natural answers according to user needs.
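The copy-versus-generate mixture at the heart of such copying mechanisms can be sketched as follows. This is a minimal single-level illustration under assumed tensor shapes and random parameters, not the paper's implementation; the multi-level variant and the question-aware loss are not reproduced here.

```python
import torch
import torch.nn.functional as F

# Hypothetical sizes and random parameters for illustration only.
vocab_size, hidden = 50, 16
torch.manual_seed(0)

dec_state = torch.randn(1, hidden)            # current decoder hidden state
enc_states = torch.randn(7, hidden)           # encoded source (question/fact) tokens
src_ids = torch.randint(0, vocab_size, (7,))  # vocabulary ids of those source tokens

# Attention weights over the source sequence (dot-product attention).
attn = F.softmax(enc_states @ dec_state.squeeze(0), dim=0)

# Generation distribution over the fixed output vocabulary.
W_gen = torch.randn(hidden, vocab_size)
p_gen_vocab = F.softmax(dec_state @ W_gen, dim=-1).squeeze(0)

# Copy distribution: scatter the attention mass onto the source tokens' ids.
p_copy = torch.zeros(vocab_size).scatter_add(0, src_ids, attn)

# Soft gate deciding between generating a common word and copying a semantic unit.
gate = torch.sigmoid(torch.randn(1))
p_final = gate * p_gen_vocab + (1 - gate) * p_copy

print(p_final.sum())  # ~1.0: a valid distribution over the output vocabulary
```

In practice the gate would be computed from the decoder state and context vector rather than sampled, and a multi-level scheme would maintain separate copy distributions for different source granularities (e.g., words and knowledge-base entities) before mixing them.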

Details

Language :
English
ISSN :
2199-4536 and 2198-6053
Volume :
10
Issue :
5
Database :
Directory of Open Access Journals
Journal :
Complex & Intelligent Systems
Publication Type :
Academic Journal
Accession number :
edsdoj.f901cc29dca14e688b9466c5331728d5
Document Type :
article
Full Text :
https://doi.org/10.1007/s40747-024-01538-5