An attentive neural network based on deep learning approach for abstractive text summarization.
- Author
- Sapra, Shruti J., Thakur, Shruti, Kapse, Avinash S., and Atique, Mohammad
- Subjects
ARTIFICIAL neural networks, TEXT summarization, MACHINE translating, DEEP learning, GENETIC transduction, PARAPHRASE
- Abstract
In recent times, abstractive text summarization has made great strides by moving away from linear models based on sparse, manually crafted features and toward nonlinear neural network models that exploit rich inputs. Deep learning models have proven effective in NLP applications because they can model complex data patterns without human-engineered features. Sequence learning has received considerable attention in the last several years. Because end-to-end training of encoder-decoder neural networks has proven successful in tasks such as machine translation, research applying similar architectures to other transduction tasks, such as paraphrase generation or abstractive summarization, has grown. In this study, we present a neural-network-based abstractive text summarizer. An attention mechanism is used to address the difficulty of processing long input sequences. The model was trained on two news-summarization datasets. The approach uses a sentence-level attention mechanism to guide attention at the word level. The results are superior to those of competing models in the literature. The experimental findings on both datasets demonstrate that the proposed model effectively improves ROUGE scores and produces a more concise summary of the original document without losing key details. [ABSTRACT FROM AUTHOR]
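The abstract refers to an attention mechanism that lets the decoder focus on relevant parts of a long input sequence. The paper's own model is not reproduced here; the following is only a minimal, self-contained sketch of generic dot-product attention (the function names and toy vectors are illustrative assumptions, not the authors' implementation), showing how a decoder state is scored against encoder states to produce a weighted context vector.

```python
import math

def softmax(scores):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_context(decoder_state, encoder_states):
    # Dot-product attention: score each encoder state against the current
    # decoder state, normalize the scores, and take the weighted sum of
    # encoder states as the context vector.
    scores = [sum(d * e for d, e in zip(decoder_state, h))
              for h in encoder_states]
    weights = softmax(scores)
    dim = len(encoder_states[0])
    context = [sum(wt * h[i] for wt, h in zip(weights, encoder_states))
               for i in range(dim)]
    return weights, context

# Toy example (hypothetical values): 3 encoder states, 2-dim hidden vectors.
enc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
dec = [1.0, 0.0]
w, c = attention_context(dec, enc)
```

In a full summarizer such as the one described, a second attention distribution computed over sentence representations would rescale these word-level weights, concentrating the decoder on words inside salient sentences.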
- Published
- 2024