
Efficient Memory-Enhanced Transformer for Long-Document Summarization in Low-Resource Regimes.

Authors :
Moro G
Ragazzi L
Valgimigli L
Frisoni G
Sartori C
Marfia G
Source :
Sensors (Basel, Switzerland) [Sensors (Basel)] 2023 Mar 28; Vol. 23 (7). Date of Electronic Publication: 2023 Mar 28.
Publication Year :
2023

Abstract

Long-document summarization poses obstacles to current generative transformer-based models because of the broad context they must process and understand. Indeed, detecting long-range dependencies is still challenging for today's state-of-the-art solutions, which usually require model expansion at the cost of an unsustainable demand for computing and memory capacity. This paper introduces Emma, a novel efficient memory-enhanced transformer-based architecture. By segmenting a lengthy input into multiple text fragments, our model stores and compares the current chunk with previous ones, gaining the capability to read and comprehend the entire document with a fixed amount of GPU memory. This method enables the model to process theoretically infinitely long documents, using less than 18 GB of memory for training and less than 13 GB for inference. We conducted extensive performance analyses and demonstrate that Emma achieves competitive results on two datasets from different domains while consuming significantly less GPU memory than competitors, even in low-resource settings.
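
The abstract describes the mechanism only at a high level. The sketch below illustrates one way chunk-wise encoding with a fixed-size memory bank could look in PyTorch; it is a minimal assumption-laden illustration, not the authors' Emma implementation, and every name in it (ChunkMemoryEncoder, mem_slots, chunk_len) is hypothetical.

# Illustrative sketch only: a transformer encoder that processes a long
# input chunk by chunk, reading from and writing to a bounded memory of
# pooled representations of earlier chunks. NOT the paper's Emma code;
# all names and hyperparameters here are hypothetical choices.
import torch
import torch.nn as nn

class ChunkMemoryEncoder(nn.Module):
    def __init__(self, d_model=256, n_heads=4, mem_slots=32, chunk_len=512):
        super().__init__()
        self.chunk_len = chunk_len
        self.mem_slots = mem_slots  # fixed slot count -> constant footprint
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Cross-attention lets the current chunk compare itself with the
        # stored summaries of previous chunks.
        self.read_mem = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x):  # x: (batch, seq_len, d_model) token embeddings
        memory = None
        outputs = []
        for chunk in x.split(self.chunk_len, dim=1):
            h = self.encoder(chunk)
            if memory is not None:
                read, _ = self.read_mem(h, memory, memory)
                h = h + read
            # Compress the chunk into one memory slot (mean pooling here
            # for simplicity) and keep only the newest mem_slots entries.
            # Detaching stops the backward graph from growing with the
            # number of chunks, one common way to bound training memory.
            summary = h.mean(dim=1, keepdim=True).detach()
            memory = summary if memory is None else torch.cat(
                [memory, summary], dim=1)[:, -self.mem_slots:]
            outputs.append(h)
        return torch.cat(outputs, dim=1)

# Usage: a 4096-token "document" is encoded in 8 chunks of 512 tokens;
# peak activation memory depends on chunk_len and mem_slots, not on the
# total document length.
model = ChunkMemoryEncoder()
doc = torch.randn(1, 4096, 256)
print(model(doc).shape)  # torch.Size([1, 4096, 256])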

Details

Language :
English
ISSN :
1424-8220
Volume :
23
Issue :
7
Database :
MEDLINE
Journal :
Sensors (Basel, Switzerland)
Publication Type :
Academic Journal
Accession Number :
37050608
Full Text :
https://doi.org/10.3390/s23073542