Extracting nonlinear neural topics with neural variational Bayes
- Source :
- World Wide Web. 25:131-149
- Publication Year :
- 2021
- Publisher :
- Springer Science and Business Media LLC, 2021.
Abstract
- Recently, topic modeling has been upgraded by neural variational inference, which allows deeper model structures and provides efficient update rules via the reparameterization trick. Models of this kind are formally referred to as neural topic models. In this paper, we investigate a limitation of neural topic models: they formulate topic embeddings and measure word weights within topics by a linear transformation between topic and word embeddings, resulting in redundant and inaccurate topic representations. To address this problem, we propose a novel neural topic model, namely the Generative Model with Nonlinear Neural Topics (GMnnt). The key insight of GMnnt is to replace topic embeddings with neural networks over topics, named neural topics, so as to capture nonlinear relationships between words in the embedding space and thereby induce more accurate topic representations. We derive the inference process of GMnnt under the framework of neural variational inference. Extensive empirical studies have been conducted on several widely used document collections, including both short texts and ordinary long texts. The experimental results validate that GMnnt outputs more semantically coherent topics than both traditional topic models and existing neural topic models.
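- To make the idea in the abstract concrete, the sketch below shows a minimal VAE-style neural topic model trained with the reparameterization trick, in which the decoder scores each word per topic with a small nonlinear network over word embeddings rather than a linear topic-embedding/word-embedding product. This is only an illustrative sketch of the general technique, not the authors' GMnnt implementation; all class and parameter names (NonlinearTopicDecoder, NeuralTopicModel, n_topics, emb_dim, hidden) are assumptions for illustration.

```python
# Illustrative sketch (PyTorch) of a neural topic model with "neural topics":
# each topic is a small MLP over word embeddings instead of a topic vector.
# Names and architecture choices are assumptions, not the paper's GMnnt code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NonlinearTopicDecoder(nn.Module):
    """Each topic is a tiny network that maps word embeddings to word scores."""
    def __init__(self, vocab_size, emb_dim, n_topics, hidden=100):
        super().__init__()
        self.word_emb = nn.Parameter(torch.randn(vocab_size, emb_dim) * 0.01)
        # One small network per topic ("neural topic") replaces the topic embedding.
        self.topic_nets = nn.ModuleList([
            nn.Sequential(nn.Linear(emb_dim, hidden), nn.Tanh(), nn.Linear(hidden, 1))
            for _ in range(n_topics)
        ])

    def beta(self):
        # Returns (n_topics, vocab_size) topic-word distributions.
        scores = torch.cat([net(self.word_emb).t() for net in self.topic_nets], dim=0)
        return F.softmax(scores, dim=1)

class NeuralTopicModel(nn.Module):
    def __init__(self, vocab_size, n_topics=50, emb_dim=300, enc_hidden=200):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(vocab_size, enc_hidden), nn.Softplus())
        self.fc_mu = nn.Linear(enc_hidden, n_topics)
        self.fc_logvar = nn.Linear(enc_hidden, n_topics)
        self.decoder = NonlinearTopicDecoder(vocab_size, emb_dim, n_topics)

    def forward(self, bow):
        # Encode the bag-of-words vector into a Gaussian over latent topics.
        h = self.encoder(bow)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        theta = F.softmax(z, dim=1)               # document-topic proportions
        word_dist = theta @ self.decoder.beta()   # mixture of topic-word distributions
        recon = -(bow * torch.log(word_dist + 1e-10)).sum(1)
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(1)
        return (recon + kl).mean()                # negative ELBO

# Minimal usage: one training step on a random bag-of-words batch.
model = NeuralTopicModel(vocab_size=2000)
loss = model(torch.rand(8, 2000))
loss.backward()
```

- The Gaussian prior and softmax mapping above follow common neural topic models in the NVDM/ProdLDA family; the prior, architecture, and training objective actually used by GMnnt are described in the paper itself.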
Details
- ISSN :
- 1573-1413 and 1386-145X
- Volume :
- 25
- Database :
- OpenAIRE
- Journal :
- World Wide Web
- Accession number :
- edsair.doi...........fb9056fdfa63736440986e8f8ab5f0a2