
A Cooperative Framework with Generative Adversarial Networks and Entropic Auto-Encoders for Text Generation

Authors:
Zhiyue Liu
Jiahai Wang
Source:
IJCNN
Publication Year:
2021
Publisher:
IEEE, 2021.

Abstract

Generating text with high quality and sufficient diversity is a fundamental task in natural language generation. Although generative adversarial networks (GANs) achieve promising results in text generation, GAN-based language models suffer from mode collapse, i.e., the generator tends to sacrifice diversity and focus on a limited set of high-quality text patterns. By contrast, maximum likelihood estimation (MLE) based language models cover various text patterns and generate diversified samples, but with poor quality. This paper proposes a cooperative framework with GANs and entropic auto-encoders (EAEs), named GAN-EAE, to combine their advantages for text generation, where EAEs are powerful MLE-based generative models built on deterministic auto-encoders. By imitating the output distribution of EAEs, the generator shapes its output distribution closer to the real data distribution, counteracting mode collapse. Meanwhile, by learning from the samples produced by the GAN generator, EAEs concentrate probability mass on high-quality patterns to improve generation quality. Similar samples obtained from the generator may aggravate mode collapse and should be downplayed during adversarial training. Thus, a sample re-weighting mechanism is adopted to improve diversity by measuring the inner distance of generated samples. Experimental results demonstrate that GAN-EAE improves both GANs and EAEs and achieves state-of-the-art performance.
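The abstract describes re-weighting generated samples by their inner distance so that near-duplicate samples contribute less during adversarial training. The paper's exact formula is not given in this record; the sketch below is one plausible instantiation, assuming each generated sample is represented by an embedding vector, using mean pairwise cosine dissimilarity as the "inner distance" and a softmax (with a hypothetical `temperature` parameter) to turn distances into weights.

```python
import numpy as np

def reweight_samples(embeddings, temperature=1.0):
    """Down-weight generated samples that are similar to one another.

    embeddings: (n, d) array, one row per generated sample.
    Returns weights that sum to n, so the average loss scale is preserved.
    Note: this is an illustrative sketch, not the paper's exact mechanism.
    """
    n = embeddings.shape[0]
    # Row-normalize so the dot product gives cosine similarity.
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / np.clip(norms, 1e-8, None)
    sim = unit @ unit.T                      # pairwise cosine similarity
    np.fill_diagonal(sim, 0.0)               # ignore self-similarity
    # "Inner distance": average dissimilarity to the other samples.
    inner_dist = 1.0 - sim.sum(axis=1) / (n - 1)
    # Softmax over distances: diverse samples receive larger weights.
    w = np.exp(inner_dist / temperature)
    return w / w.sum() * n
```

With this scheme, a batch containing two near-duplicate samples and one distinct sample assigns the distinct sample the largest weight, so the discriminator's gradient is not dominated by repeated patterns.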

Details

Database:
OpenAIRE
Journal:
2021 International Joint Conference on Neural Networks (IJCNN)
Accession number:
edsair.doi...........749020c77213b41c5ab8f6b070715fce