
A Hybrid Chinese Conversation model based on retrieval and generation

Authors :
Qing Tian
Huimin Yang
Najla Al-Nabhan
Tinghuai Ma
Yuan Tian
Source :
Future Generation Computer Systems. 114:481-490
Publication Year :
2021
Publisher :
Elsevier BV, 2021.

Abstract

Conversation generation is an important natural language processing task and has attracted much attention in recent years. Conversation models are also of great significance to social computing, since they help build artificial intelligence bots on social networks. Open-domain conversation models are fundamentally data-driven and can be roughly divided into retrieval models and generation models. Although remarkable progress has been achieved in recent years, it is still difficult to obtain responses that are grammatically and semantically appropriate. We propose the Rerank of Retrieval-based and Transformer-based Conversation model (RRT), a novel conversation model that combines a retrieval model with a generation model to obtain context-appropriate responses. The context–response pairs with the highest similarity to the input are retrieved from the training set using a traditional retrieval method and further ranked to obtain the retrieval candidate response. We replace the traditional sequence-to-sequence models for conversation generation with the Transformer model and achieve better results with less training time. Finally, a post-reranking module ranks the retrieved candidate and the generated one to obtain the final response. We conducted detailed experiments on two datasets; the results show that, compared with the traditional generation model, our model improves significantly on every metric while reducing training time by a factor of 5. Furthermore, its responses are more informative and more relevant to the input context than those of the retrieval model.
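The retrieve / generate / rerank pipeline described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the TF-IDF retriever, the generate_candidate stub standing in for the Transformer generator, and the cosine-similarity reranker are all assumed placeholders.

```python
# Minimal sketch of a retrieve / generate / rerank conversation pipeline.
# NOT the paper's implementation: retriever, generator stub, and reranker
# here are simple stand-ins for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy training set of (context, response) pairs.
train_pairs = [
    ("how is the weather today", "it is sunny and warm"),
    ("what are you doing now", "I am reading a book"),
    ("do you like movies", "yes, I watch one every weekend"),
]

contexts = [c for c, _ in train_pairs]
vectorizer = TfidfVectorizer().fit(contexts)
context_matrix = vectorizer.transform(contexts)

def retrieve_candidate(query: str) -> str:
    """Return the response whose context is most similar to the query."""
    sims = cosine_similarity(vectorizer.transform([query]), context_matrix)[0]
    return train_pairs[sims.argmax()][1]

def generate_candidate(query: str) -> str:
    """Placeholder for the Transformer-based generator (not implemented here)."""
    return "generated response for: " + query

def rerank(query: str, candidates: list) -> str:
    """Pick the candidate most similar to the query; a stand-in for the
    paper's post-reranking module, which may use a learned scorer."""
    sims = cosine_similarity(vectorizer.transform([query]),
                             vectorizer.transform(candidates))[0]
    return candidates[sims.argmax()]

query = "what is the weather like"
final = rerank(query, [retrieve_candidate(query), generate_candidate(query)])
print(final)
```

In the paper itself, the generated candidate comes from a Transformer trained on the conversation corpus, and the post-reranking module scores both candidates against the input context to select the final response.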

Details

ISSN :
0167-739X
Volume :
114
Database :
OpenAIRE
Journal :
Future Generation Computer Systems
Accession number :
edsair.doi...........fb5f607fe3eab6eb39a845ef90a0aa37
Full Text :
https://doi.org/10.1016/j.future.2020.08.030