GLSE: Global-Local Selective Encoding for Response Generation in Neural Conversation Model
- Source : ICTAI
- Publication Year : 2019
- Publisher : IEEE, 2019
Abstract
- How to generate relevant and informative responses is one of the core topics in the response generation area. Following the task formulation of neural machine translation, previous works mainly treat response generation as a mapping from a source sentence to a target sentence. However, when trained to maximize the likelihood of the response given the message in an almost lossless manner, just like MT, the dialogue model tends to generate safe, commonplace responses (e.g., "I don't know") regardless of the input. Different from existing works, we propose a Global-Local Selective Encoding model (GLSE) that extends the seq2seq framework to generate more relevant and informative responses. Specifically, two types of selective gate network are introduced in this work: (i) a local selective word-sentence gate, added after the encoding phase of the seq2seq learning framework, learns to tailor the original message information and produces a selected input representation; (ii) a global selective bidirectional-context gate controls the bidirectional information flow from a BiGRU-based encoder to the decoder. Empirical studies indicate the advantage of our model over several classical and strong baselines.
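The local word-sentence gate described in the abstract can be illustrated with a minimal numpy sketch: each per-word encoder state is rescaled element-wise by a sigmoid gate computed from that state and a sentence-level summary vector. All shapes, parameter names (`W`, `U`, `b`), and the use of random weights here are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
T, d = 5, 8                      # assumed sequence length and hidden size

H = rng.standard_normal((T, d))  # per-word encoder hidden states h_1..h_T
s = rng.standard_normal(d)       # sentence-level summary vector

# Hypothetical gate parameters (random for the sketch, learned in practice)
W = rng.standard_normal((d, d)) * 0.1
U = rng.standard_normal((d, d)) * 0.1
b = np.zeros(d)

# Local word-sentence gate: each word state is filtered by a gate in (0, 1)
# that depends on both the word state and the sentence summary.
G = sigmoid(H @ W + s @ U + b)   # shape (T, d)
H_selected = H * G               # tailored ("selected") input representation

print(H_selected.shape)          # (5, 8)
```

The global bidirectional-context gate would apply the same idea one level up, weighting the forward and backward BiGRU context vectors before they reach the decoder.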
- Subjects :
- Machine translation
Computer science
Speech recognition
Deep learning
Information flow
Encoding
Conversation
Artificial intelligence
Representation
Encoder
Sentence
- Database : OpenAIRE
- Journal : 2019 IEEE 31st International Conference on Tools with Artificial Intelligence (ICTAI)
- Accession number : edsair.doi...........07673c9ee38a6ec6480162ddff9f4bd3
- Full Text : https://doi.org/10.1109/ictai.2019.00166