
Efficient Relational Sentence Ordering Network.

Authors :
Li, Yingming
Cui, Baiyun
Zhang, Zhongfei
Source :
IEEE Transactions on Pattern Analysis & Machine Intelligence. Oct 2022, Vol. 44 Issue 10, p6169-6183. 15p.
Publication Year :
2022

Abstract

In this paper, we propose a novel deep Efficient Relational Sentence Ordering Network (referred to as ERSON) that leverages a pre-trained language model in both the encoder and the decoder to strengthen the coherence modeling of the entire model. Specifically, we first introduce a divide-and-fuse BERT (referred to as DF-BERT), a refactoring of the BERT network in which the lower layers encode each sentence of the paragraph independently, so that these encodings can be shared across different sentence pairs, while the higher layers jointly learn the cross-attention between sentence pairs. This design captures the semantic concepts and contextual information between the sentences of the paragraph while significantly reducing runtime and memory consumption without sacrificing model performance. In addition, a Relational Pointer Decoder (referred to as RPD) is developed, which utilizes the pre-trained Next Sentence Prediction (NSP) task of BERT to capture useful relative ordering information between sentences and thereby enhance the order predictions. Furthermore, a variety of knowledge distillation based losses are added as auxiliary supervision to further improve the ordering performance. Extensive evaluations on Sentence Ordering, Order Discrimination, and Multi-Document Summarization tasks show the superiority of ERSON over state-of-the-art ordering methods. [ABSTRACT FROM AUTHOR]
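The efficiency claim behind DF-BERT can be illustrated with a simple count of lower-layer encoder passes. The sketch below is a hypothetical back-of-the-envelope calculation, not code from the paper: a naive pairwise encoder re-runs the lower layers on both sentences of every ordered pair, whereas the divide-and-fuse split runs them once per sentence and reuses the results.

```python
# Illustrative only: counts of lower-layer encoder passes for a paragraph
# of n sentences, comparing naive pairwise encoding with DF-BERT-style sharing.
def encoder_calls(n_sentences):
    # Naive: every ordered pair (i, j), i != j, re-encodes both sentences.
    naive_lower = n_sentences * (n_sentences - 1)
    # Shared (divide-and-fuse): each sentence is encoded once and reused.
    shared_lower = n_sentences
    # The higher cross-attention layers still run once per ordered pair.
    pair_calls = n_sentences * (n_sentences - 1)
    return naive_lower, shared_lower, pair_calls

naive, shared, pairs = encoder_calls(10)
print(naive, shared, pairs)  # 90 10 90
```

The lower-layer cost drops from quadratic to linear in the number of sentences, which is consistent with the runtime and memory savings the abstract describes; the exact speedup depends on how many layers sit below the split.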
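To make the role of pairwise NSP-style scores concrete, here is a minimal hypothetical sketch of turning a matrix of "sentence i precedes sentence j" probabilities into an ordering. Note this brute-force search is only an illustrative stand-in: ERSON itself uses the Relational Pointer Decoder, not exhaustive enumeration.

```python
from itertools import permutations

def best_order(score):
    """Pick the permutation maximizing the total pairwise 'precedes' score.

    score[i][j] is a (hypothetical) NSP-style probability that sentence i
    should appear before sentence j. Brute force is exponential and only
    feasible for tiny paragraphs; it serves purely as an illustration.
    """
    n = len(score)
    return max(
        permutations(range(n)),
        key=lambda p: sum(score[p[a]][p[b]] for a in range(n) for b in range(a + 1, n)),
    )

scores = [[0.0, 0.9, 0.8],
          [0.1, 0.0, 0.7],
          [0.2, 0.3, 0.0]]
print(best_order(scores))  # (0, 1, 2)
```

A pointer-style decoder reaches a similar goal greedily, selecting the next sentence step by step while conditioning on the sentences already placed, which scales to realistic paragraph lengths.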

Details

Language :
English
ISSN :
0162-8828
Volume :
44
Issue :
10
Database :
Academic Search Index
Journal :
IEEE Transactions on Pattern Analysis & Machine Intelligence
Publication Type :
Academic Journal
Accession number :
159210528
Full Text :
https://doi.org/10.1109/TPAMI.2021.3085738