
Jointly Optimizing Diversity and Relevance in Neural Response Generation

Authors :
Gao, Xiang
Lee, Sungjin
Zhang, Yizhe
Brockett, Chris
Galley, Michel
Gao, Jianfeng
Dolan, Bill
Publication Year :
2019

Abstract

Although recent neural conversation models have shown great potential, they often generate bland and generic responses. While various approaches have been explored to diversify the output of the conversation model, the improvement often comes at the cost of decreased relevance. In this paper, we propose a SpaceFusion model to jointly optimize diversity and relevance that essentially fuses the latent space of a sequence-to-sequence model and that of an autoencoder model by leveraging novel regularization terms. As a result, our approach induces a latent space in which the distance and direction from the predicted response vector roughly match the relevance and diversity, respectively. This property also lends itself well to an intuitive visualization of the latent space. Both automatic and human evaluation results demonstrate that the proposed approach brings significant improvement compared to strong baselines in both diversity and relevance.

Comment: Long paper accepted at NAACL 2019
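The abstract describes fusing the latent space of a sequence-to-sequence model with that of an autoencoder via regularization. The snippet below is a minimal sketch of that general idea, not the authors' released implementation: it assumes PyTorch, toy latent dimensions, and placeholder loss forms and weights (the function name fusion_regularizers and the optional interpolation callback are hypothetical).

```python
# A minimal sketch (assumptions noted above) of latent-space fusion
# regularization: pull the S2S prediction toward the AE encoding of the
# gold response, while keeping different examples spread apart.
import torch
import torch.nn.functional as F

def fusion_regularizers(z_s2s, z_ae, interp_decode_loss=None):
    """z_s2s: context encoded by the S2S branch, shape (batch, dim).
    z_ae: gold response encoded by the AE branch, shape (batch, dim)."""
    batch = z_s2s.size(0)

    # Fusion term: the predicted-response vector should lie close to the
    # autoencoded gold response (distance ~ relevance).
    fuse = F.mse_loss(z_s2s, z_ae)

    # Dispersion term (placeholder form): discourage collapse by pushing
    # apart latent codes of different examples within the batch.
    pdist = torch.cdist(z_ae, z_ae)                  # (batch, batch)
    off_diag = ~torch.eye(batch, dtype=torch.bool)
    disperse = -pdist[off_diag].mean()

    loss = fuse + disperse

    # Optionally decode from random interpolations between the two codes
    # and add that reconstruction loss, if a decoder loss callable is given.
    if interp_decode_loss is not None:
        u = torch.rand(batch, 1, device=z_s2s.device)
        z_interp = u * z_s2s + (1.0 - u) * z_ae
        loss = loss + interp_decode_loss(z_interp)
    return loss

# Usage with random vectors standing in for encoder outputs:
z_s2s = torch.randn(8, 128)
z_ae = torch.randn(8, 128)
print(fusion_regularizers(z_s2s, z_ae))
```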

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.1902.11205
Document Type :
Working Paper