Reducing repetition in convolutional abstractive summarization.
- Source:
- Natural Language Engineering; Jan 2023, Vol. 29, Issue 1, p81-109, 29p
- Publication Year:
- 2023
Abstract
- Convolutional sequence-to-sequence (CNN seq2seq) models have met with success in abstractive summarization. However, their outputs often contain repetitive word sequences and logical inconsistencies, limiting their practical application. In this paper, we identify the causes of the repetition problem in CNN-based abstractive summarization by observing the attention maps between repetitive summaries and their corresponding source documents, and we mitigate the problem accordingly. We propose to reduce repetition in summaries with an attention filter mechanism (ATTF) and a sentence-level backtracking decoder (SBD), which dynamically redistribute attention over the input sequence as the output sentences are generated. The ATTF records previously attended locations in the source document and prevents the decoder from attending to those locations again. The SBD prevents the decoder from generating similar sentences more than once via backtracking at test time. The proposed model outperforms the baselines in terms of ROUGE score, repeatedness, and readability. The results show that this approach generates high-quality summaries with minimal repetition and improves the reading experience. [ABSTRACT FROM AUTHOR]
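
The abstract does not reproduce the paper's exact ATTF formulation, but the idea it describes (record attended source locations, then block the decoder from re-attending to them) resembles a coverage-style hard mask. The sketch below is a minimal illustration of that idea only; the function names, the `threshold` parameter, and the uniform fallback for the all-blocked case are illustrative assumptions, not the authors' method.

```python
import numpy as np

def masked_softmax(scores: np.ndarray, blocked: np.ndarray) -> np.ndarray:
    """Softmax over source positions, forcing blocked positions to zero weight."""
    scores = np.where(blocked, -np.inf, scores)
    if not np.isfinite(scores).any():
        # Degenerate case (every position filtered): fall back to uniform attention.
        # A real decoder would handle this differently; this is a sketch-only guard.
        return np.full(scores.shape, 1.0 / scores.size)
    exp = np.exp(scores - scores[np.isfinite(scores)].max())
    return exp / exp.sum()

def decode_with_attention_filter(score_fn, num_steps: int, src_len: int,
                                 threshold: float = 0.9):
    """Toy decoding loop with a coverage-style attention filter.

    `score_fn(t)` returns raw attention scores over the source at step t.
    A position whose cumulative attention exceeds `threshold` is treated as
    already summarized and masked out of all later steps (hypothetical
    stand-in for the paper's ATTF).
    """
    coverage = np.zeros(src_len)             # cumulative attention per source position
    blocked = np.zeros(src_len, dtype=bool)  # positions the filter has locked out
    history = []
    for t in range(num_steps):
        attn = masked_softmax(score_fn(t), blocked)
        history.append(attn)
        coverage += attn
        blocked |= coverage > threshold      # record attended locations for the filter
    return history

# Example: random scores stand in for a real decoder's attention logits.
rng = np.random.default_rng(0)
weights = decode_with_attention_filter(lambda t: rng.normal(size=6),
                                       num_steps=8, src_len=6)
```

Hard masking like this guarantees a once-covered source span cannot be attended to again, in contrast to soft coverage penalties (as in pointer-generator coverage losses) that merely discourage re-attention; which behavior the paper's ATTF implements would need to be confirmed against the full text.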
- Subjects:
- DEEP learning
- SHIFT registers
Details
- Language:
- English
- ISSN:
- 1351-3249
- Volume:
- 29
- Issue:
- 1
- Database:
- Complementary Index
- Journal:
- Natural Language Engineering
- Publication Type:
- Academic Journal
- Accession number:
- 161606704
- Full Text:
- https://doi.org/10.1017/S1351324921000309