
Long and Diverse Text Generation with Planning-based Hierarchical Variational Model

Authors:
Shao, Zhihong
Huang, Minlie
Wen, Jiangtao
Xu, Wenfei
Zhu, Xiaoyan
Publication Year:
2019

Abstract

Existing neural methods for data-to-text generation are still struggling to produce long and diverse texts: they are insufficient to model input data dynamically during generation, to capture inter-sentence coherence, or to generate diversified expressions. To address these issues, we propose a Planning-based Hierarchical Variational Model (PHVM). Our model first plans a sequence of groups (each group is a subset of input items to be covered by a sentence) and then realizes each sentence conditioned on the planning result and the previously generated context, thereby decomposing long text generation into dependent sentence generation sub-tasks. To capture expression diversity, we devise a hierarchical latent structure where a global planning latent variable models the diversity of reasonable planning and a sequence of local latent variables controls sentence realization. Experiments show that our model outperforms state-of-the-art baselines in long and diverse text generation.

Comment: To appear in EMNLP 2019
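As a rough illustration of the generative process the abstract describes, the following minimal Python sketch mirrors PHVM's two latent levels: a global planning latent variable z_p drives the grouping of input items into a plan, and a local latent variable z_s is drawn per sentence. The function names (phvm_generate, plan_net, decode_net), the latent dimension, and the toy stand-ins are hypothetical; the actual model replaces them with trained neural networks and learned priors rather than the unconditioned random draws used here.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def phvm_generate(input_items, plan_net, decode_net, latent_dim=16):
    """Schematic two-level generative process (simplified from the abstract):
    sample a global plan latent, plan groups, then realize each sentence
    from a local latent plus the previously generated context."""
    # Global planning latent variable: models the diversity of reasonable plans.
    z_p = rng.standard_normal(latent_dim)

    # Plan a sequence of groups; each group is a subset of input items
    # to be covered by one sentence.
    groups = plan_net(input_items, z_p)

    sentences, context = [], []
    for group in groups:
        # Local latent variable: controls how this sentence is realized.
        z_s = rng.standard_normal(latent_dim)
        # Condition on the group, the local latent, and prior sentences,
        # which is what makes the sentence sub-tasks dependent.
        sentence = decode_net(group, z_s, context)
        context.append(sentence)
        sentences.append(sentence)
    return sentences

# Toy stand-ins (hypothetical): the real planner/decoder are neural networks.
def toy_plan(items, z_p):
    # Deterministically pair up items; a learned planner would use z_p.
    return [items[i:i + 2] for i in range(0, len(items), 2)]

def toy_decode(group, z_s, context):
    return "Sentence covering " + " and ".join(group) + "."

print(phvm_generate(["material: cotton", "style: casual", "color: white"],
                    toy_plan, toy_decode))
```

The point of the sketch is the structure, not the modeling: one draw of z_p fixes an entire plan (so different draws yield different groupings and sentence orderings), while each z_s varies the wording of a single sentence without disturbing the plan.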

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.1908.06605
Document Type:
Working Paper