News headline generation based on improved decoder from transformer.
Source: Scientific Reports [Sci Rep] 2022 Jul 08; Vol. 12 (1), pp. 11648. Date of Electronic Publication: 2022 Jul 08.
Publication Year: 2022
Abstract
Most news headline generation models based on sequence-to-sequence or recurrent networks have two shortcomings: they cannot be parallelized efficiently and they easily generate repeated words. Such models struggle to select the important words in a news article and reproduce those expressions, so the generated headline summarizes the news inaccurately. In this work, we propose the TD-NHG model, which stands for news headline generation based on an improved decoder from the transformer. TD-NHG uses masked multi-head self-attention to learn feature information from different representation subspaces of the news text, and in the decoding stage it applies a selection strategy combining top-k sampling, top-p sampling, and a repetition-penalty mechanism. We conducted comparative experiments on the LCSTS and CSTS datasets. Rouge-1, Rouge-2, and Rouge-L scores on LCSTS/CSTS are 31.28/38.73, 12.68/24.97, and 28.31/37.47, respectively. The experimental results demonstrate that the proposed method improves the accuracy and diversity of generated news headlines.
(© 2022. The Author(s).)
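The abstract names a decoding selection strategy built from top-k sampling, top-p (nucleus) sampling, and a repetition penalty. The sketch below shows how one decoding step combining those three mechanisms can be implemented; it is an illustrative reconstruction, not the paper's code, and the function name, default parameter values, and the divide/multiply form of the repetition penalty are assumptions.

```python
import numpy as np

def sample_next_token(logits, generated_ids, top_k=50, top_p=0.9, repetition_penalty=1.2):
    """Illustrative decoding step: repetition penalty, then top-k, then top-p filtering.

    `logits` is a 1-D array of raw scores over the vocabulary and `generated_ids`
    holds the token ids emitted so far. Names and defaults are assumptions.
    """
    logits = logits.astype(np.float64).copy()

    # Repetition penalty: down-weight tokens that have already been generated.
    for tok in set(generated_ids):
        if logits[tok] > 0:
            logits[tok] /= repetition_penalty
        else:
            logits[tok] *= repetition_penalty

    # Top-k: keep only the k highest-scoring tokens.
    if top_k > 0:
        kth_best = np.sort(logits)[-top_k] if top_k <= len(logits) else logits.min()
        logits[logits < kth_best] = -np.inf

    # Top-p (nucleus): keep the smallest set of tokens whose cumulative probability >= top_p.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    order = np.argsort(-probs)
    cumulative = np.cumsum(probs[order])
    drop = cumulative > top_p
    drop[0] = False  # always keep at least the single most likely token
    logits[order[drop]] = -np.inf

    # Renormalize over the surviving tokens and sample the next token id.
    probs = np.exp(logits - logits[np.isfinite(logits)].max())
    probs[~np.isfinite(logits)] = 0.0
    probs /= probs.sum()
    return int(np.random.choice(len(probs), p=probs))

# Toy usage: pick the next token from a 10-word vocabulary, penalizing repeats of ids 3 and 7.
toy_logits = np.random.default_rng(0).normal(size=10)
next_token = sample_next_token(toy_logits, generated_ids=[3, 3, 7])
```

Combining the two truncation rules with a repetition penalty is what gives the reported trade-off between accuracy and diversity: top-k and top-p restrict sampling to plausible tokens, while the penalty discourages the repeated-word generation the abstract identifies as a weakness of earlier models.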
Subjects: Causality
Details
Language: English
ISSN: 2045-2322
Volume: 12
Issue: 1
Database: MEDLINE
Journal: Scientific Reports
Publication Type: Academic Journal
Accession Number: 35804183
Full Text: https://doi.org/10.1038/s41598-022-15817-z