1. Improvement of a Transformer summary generation model incorporating a pointer network (融合指针网络的 Transformer 摘要生成模型的改进)
- Authors
- 李维乾 and 蒲程磊
- Subjects
- *VOCABULARY; *MACHINE learning; *TASKS; *LANGUAGE & languages
- Abstract
The traditional Encoder-Decoder model with an attention mechanism suffers from text redundancy, inconsistent representation, and out-of-vocabulary (OOV) words when applied to the summarization task, resulting in low accuracy of the generated summaries. The Transformer model, with embeddable text position information, was improved by introducing a pointer network to assist decoding, allowing the model to copy words directly from the input text when generating the summary. The effectiveness of the improved model was verified on the LCSTS Chinese short-text summarization dataset. The results show that the model outperforms the benchmark models by an average of two points in ROUGE scores, and that the salience of the generated content and the fluency of the language are significantly improved while keeping the summary consistent with the input text. [ABSTRACT FROM AUTHOR]
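The abstract's core mechanism, a pointer network that helps the decoder handle OOV words by copying from the input, is commonly realized as a pointer-generator mixture: the final word distribution blends the decoder's vocabulary softmax with the attention weights over source tokens. The sketch below illustrates that mixture in plain Python; the function name, arguments, and the scalar `p_gen` gate are illustrative assumptions, not the authors' actual implementation.

```python
def final_distribution(p_vocab, attention, src_ids, p_gen, extended_size):
    """Mix generation and copy probabilities (pointer-generator style sketch).

    p_vocab:       softmax over the fixed vocabulary (length V)
    attention:     attention weights over the T source tokens (sum to 1)
    src_ids:       extended-vocabulary ids of the T source tokens; ids >= V
                   denote source-only OOV words
    p_gen:         probability of generating from the vocabulary vs. copying
    extended_size: V plus the number of source-only OOV words
    """
    dist = [0.0] * extended_size
    # Generation mode: scale the decoder's vocabulary distribution by p_gen.
    for i, p in enumerate(p_vocab):
        dist[i] += p_gen * p
    # Copy mode: scatter-add attention mass onto source token ids, which is
    # what lets the model emit OOV words that appear only in the input.
    for a, idx in zip(attention, src_ids):
        dist[idx] += (1.0 - p_gen) * a
    return dist
```

Because both input distributions sum to one, the mixture also sums to one, and any probability landing on ids at or beyond V corresponds to copying an OOV word from the source.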
- Published
- 2022