
On the Utility of Combining Topic Models and Recurrent Neural Networks

Authors:
Corey W. Arnold
Jae Lee
Justin Wood
Bohan Li
Wei Wang
Source:
Lecture Notes in Networks and Systems (ISBN: 9783030797560)
Publication Year:
2021
Publisher:
Springer International Publishing, 2021.

Abstract

Learning with deep neural network models has shown an interesting ability to generate semi-structured sequences. The generated fragments can appear remarkably similar to real data, yet applications of these generative techniques are generally lacking. This is partly because sequences produced by recurrent neural networks fail to stay semantically connected across long, related fragments. One way to improve the semantic coherence of these fragments is to combine features learned by topic models with those of recurrent neural networks. This combination yields a higher correlation between generated sequences, much as in real data. In this paper we investigate existing approaches to adding topics into recurrent neural networks. We then develop a novel approach, Topic-RNN, which represents the state of the art in integrating topics into the RNN. Our results show that this method substantially outperforms existing approaches. Additionally, we show that, where topic information is available, Topic-RNN surpasses word-level RNNs on the task of word prediction.
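The abstract does not spell out the Topic-RNN architecture, but the general idea of combining topic-model features with an RNN can be illustrated with a small sketch. The module name, layer sizes, and the concatenation scheme below are illustrative assumptions (e.g., a document-level topic vector such as LDA proportions concatenated to each word embedding), not the paper's actual method:

```python
# Minimal sketch (not the authors' code): condition an RNN language model
# on a document-level topic vector, e.g. topic proportions from LDA.
import torch
import torch.nn as nn

class TopicConditionedRNN(nn.Module):  # hypothetical module name
    def __init__(self, vocab_size, embed_dim=128, topic_dim=50, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # The topic vector is concatenated to every word embedding, so the
        # recurrent input is embed_dim + topic_dim wide.
        self.rnn = nn.GRU(embed_dim + topic_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, topic):
        # tokens: (batch, seq_len) word ids; topic: (batch, topic_dim) proportions
        emb = self.embed(tokens)                                     # (B, T, E)
        topic_rep = topic.unsqueeze(1).expand(-1, emb.size(1), -1)   # (B, T, K)
        h, _ = self.rnn(torch.cat([emb, topic_rep], dim=-1))         # (B, T, H)
        return self.out(h)                                           # next-word logits

# Usage: next-word distributions for a toy batch.
model = TopicConditionedRNN(vocab_size=10000)
tokens = torch.randint(0, 10000, (2, 20))
topic = torch.softmax(torch.randn(2, 50), dim=-1)  # stand-in for LDA proportions
logits = model(tokens, topic)                      # shape (2, 20, 10000)
```

The design choice sketched here, feeding the same topic vector at every timestep, is one common way to keep long generated fragments tied to a shared theme; the paper's Topic-RNN may integrate topics differently.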

Details

ISBN:
978-3-030-79756-0
Database:
OpenAIRE
Journal:
Lecture Notes in Networks and Systems (ISBN: 9783030797560)
Accession number:
edsair.doi...........06b7a41fc0ed51a8c61bafb8c597cbc4