Automatic text summarization using transformer-based language models.
- Author
- Rao, Ritika; Sharma, Sourabh; Malik, Nitin
- Abstract
Automatic text summarization is a prominent field in natural language processing (NLP). The volume of data in circulation has multiplied with the shift to digital media, and the knowledge and information held in these massive datasets must be extracted to be useful. This article focuses on building an automated text summarization system that accepts text as input and outputs a summary using a state-of-the-art machine learning model. Advances in NLP led to the introduction of transformers, whose outstanding performance has drawn considerable attention. Two transformer-based language models, the Bidirectional and Auto-Regressive Transformer (BART) and the Text-To-Text Transfer Transformer (T5), were implemented on the CNN_dailymail dataset. BART outperforms T5 by 3.02% in ROUGE-1 score, and it performs better than the other models reported in the existing literature for the same task.
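
The abstract describes summarizing CNN_dailymail articles with BART and T5 and comparing them by ROUGE-1. The sketch below illustrates one way such an experiment could be set up with the Hugging Face `transformers`, `datasets`, and `rouge_score` libraries; the specific checkpoints (`facebook/bart-large-cnn`, `t5-base`), the dataset configuration, the test-set slice size, and the generation settings are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch (assumed setup, not the authors' exact pipeline):
# summarize CNN/DailyMail articles with BART and T5 and report mean ROUGE-1 F1.
from transformers import pipeline
from datasets import load_dataset
from rouge_score import rouge_scorer

# Small slice of the test split for a quick comparison (size is arbitrary here).
dataset = load_dataset("cnn_dailymail", "3.0.0", split="test[:10]")

# Two transformer-based summarizers; checkpoint choices are assumptions.
summarizers = {
    "BART": pipeline("summarization", model="facebook/bart-large-cnn"),
    "T5": pipeline("summarization", model="t5-base"),
}

scorer = rouge_scorer.RougeScorer(["rouge1"], use_stemmer=True)

for name, summarizer in summarizers.items():
    f1_scores = []
    for example in dataset:
        # Truncate long articles to the model's maximum input length.
        summary = summarizer(
            example["article"],
            truncation=True,
            max_length=128,
            min_length=30,
            do_sample=False,
        )[0]["summary_text"]
        # Compare the generated summary against the reference highlights.
        score = scorer.score(example["highlights"], summary)
        f1_scores.append(score["rouge1"].fmeasure)
    print(f"{name}: mean ROUGE-1 F1 = {sum(f1_scores) / len(f1_scores):.4f}")
```

Averaging ROUGE-1 F1 over the evaluated examples gives one score per model, which is the kind of comparison behind the reported 3.02% gap between BART and T5.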
- Published
- 2024