A generative summarization model combining NLG and NLU.
- Author
- Lv, Fangxing; Liu, Wenfeng; Yang, Yuzhen; Gao, Yaling; and Bao, Longqing
- Abstract
The automatic generation of natural language is a complex and essential task in text processing. This study proposes a novel approach to this fundamental problem by leveraging an improved version of DST_BERT, a model that converts input text into a vector representation. Our key contribution lies in the joint optimization of two models, NLU (Natural Language Understanding) and NLG (Natural Language Generation), which enables us to obtain variable representations within a shared hidden space. This integration enhances the ability of both NLU and NLG to generate coherent and contextually appropriate language. The NLU and NLG models are integrated through the hidden variable space, forming a generative representation model. To assess the effectiveness of the proposed approach, we conducted extensive experiments on the E2E and Weather datasets; the results show that our model achieves state-of-the-art performance in generating natural language.
- Published
- 2024
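The core idea in the abstract — an NLU model and an NLG model optimized jointly against one shared hidden representation — can be sketched in miniature as a linear encoder/decoder pair trained with a single reconstruction objective. This is an illustrative toy in NumPy, not the paper's DST_BERT-based model; all dimensions, weights, and the training data are hypothetical stand-ins (e.g. `X` plays the role of sentence embeddings).

```python
# Toy sketch (assumed setup, not the authors' method): a shared hidden
# space couples an "NLU" encoder with an "NLG" decoder, and both are
# updated jointly so they agree on one latent representation.
import numpy as np

rng = np.random.default_rng(0)
d_in, d_z = 8, 3                                   # input dim, latent dim
W_enc = rng.normal(scale=0.1, size=(d_z, d_in))    # NLU side: text vector -> z
W_dec = rng.normal(scale=0.1, size=(d_in, d_z))    # NLG side: z -> text vector

X = rng.normal(size=(32, d_in))                    # stand-in sentence embeddings

def loss(X, W_enc, W_dec):
    Z = X @ W_enc.T            # encode (NLU)
    X_hat = Z @ W_dec.T        # decode (NLG)
    return float(np.mean((X - X_hat) ** 2))

lr = 0.05
losses = []
for _ in range(200):
    Z = X @ W_enc.T
    X_hat = Z @ W_dec.T
    err = X_hat - X
    # Joint optimization: gradients for both models flow through the
    # same latent Z (constant factors are absorbed into the learning rate).
    grad_dec = err.T @ Z / len(X)              # shape (d_in, d_z)
    grad_enc = (err @ W_dec).T @ X / len(X)    # shape (d_z, d_in)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc
    losses.append(loss(X, W_enc, W_dec))

print(f"reconstruction loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The design point the sketch illustrates is that neither model is trained in isolation: the decoder's error signal shapes the encoder's latent space and vice versa, which is the "joint optimization" the abstract credits for coherent generation.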