
A hierarchical framework based on transformer technology to achieve factual consistent and non-redundant abstractive text summarization.

Authors :
Swetha, G.
Kumar, S. Phani
Source :
Multimedia Tools & Applications; May2024, Vol. 83 Issue 16, p47587-47608, 22p
Publication Year :
2024

Abstract

Abstractive summarization is a popular topic that has drawn researchers' attention for several years, owing to its wide range of applications. Most existing summarization frameworks cannot produce effective abstracts because the contextual information of the input is not given enough importance. To address this problem, this work introduces a hierarchical framework based on transformer technology. The proposed framework comprises three basic steps: preprocessing, extractive summarization, and abstractive summarization. Initially, the input content is preprocessed to obtain a clean document, which is then passed to the extractive summarization unit. This unit consists of a fine-tuned BERTSum model (FTBS), a pre-trained model that produces the required extractive summary. The output is then fed to the proposed convolutional bidirectional gated recurrent unit transformer (CBi-GRUT) model, in which an additional encoder is introduced alongside the traditional transformer architecture to generate the final summary. The model's outcomes are assessed against existing models to demonstrate its efficacy, with evaluations carried out on the CNN/Daily Mail dataset. The proposed method achieved an average ROUGE-1 score of 0.78, an average ROUGE-2 score of 0.68, and an average ROUGE-L score of 0.77. [ABSTRACT FROM AUTHOR]
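The abstract reports performance in terms of ROUGE-1, ROUGE-2, and ROUGE-L. As a self-contained illustration of what such a metric measures (this is a minimal sketch, not the authors' evaluation code; standard toolkits also apply stemming and other normalization), ROUGE-1 F1 can be computed as unigram overlap between a reference summary and a candidate summary:

```python
from collections import Counter

def rouge_1_f1(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: unigram overlap between reference and candidate summaries.

    A simplified sketch using whitespace tokenization and lowercasing;
    production implementations add stemming and richer tokenization.
    """
    ref_tokens = reference.lower().split()
    cand_tokens = candidate.lower().split()
    if not ref_tokens or not cand_tokens:
        return 0.0
    # Multiset intersection counts each shared unigram at most
    # as many times as it appears in both summaries.
    overlap = sum((Counter(ref_tokens) & Counter(cand_tokens)).values())
    precision = overlap / len(cand_tokens)
    recall = overlap / len(ref_tokens)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

An identical candidate and reference yield an F1 of 1.0; a candidate covering half the reference's unigrams with perfect precision yields 2/3. ROUGE-2 repeats this over bigrams, and ROUGE-L scores the longest common subsequence instead of n-gram overlap.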

Details

Language :
English
ISSN :
1380-7501
Volume :
83
Issue :
16
Database :
Complementary Index
Journal :
Multimedia Tools & Applications
Publication Type :
Academic Journal
Accession number :
177079345
Full Text :
https://doi.org/10.1007/s11042-023-17426-y