
Coreference Resolution Based on High-Dimensional Multi-Scale Information.

Authors :
Wang, Yu
Ding, Zenghui
Wang, Tao
Xu, Shu
Yang, Xianjun
Sun, Yining
Source :
Entropy. Jun 2024, Vol. 26, Issue 6, p529. 15p.
Publication Year :
2024

Abstract

Coreference resolution is a key task in Natural Language Processing. Evaluating the similarity of long-span texts is difficult, which makes text-level encoding challenging. This paper first compares how commonly used methods for improving a model's ability to gather global information affect BERT's encoding performance. Building on this comparison, a multi-scale context information module is designed to improve the applicability of the BERT encoder across different text spans. In addition, linear separability is improved through dimension expansion. Finally, cross-entropy is used as the loss function. After adding the module designed in this article to BERT and SpanBERT, F1 increased by 0.5% and 0.2%, respectively. [ABSTRACT FROM AUTHOR]
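To make the abstract's pipeline concrete, the following is a minimal sketch of the kind of architecture it describes: multi-scale context aggregation over BERT token embeddings, dimension expansion before classification, and cross-entropy as the loss. All class names, layer choices, and hyperparameters here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: multi-scale context aggregation over BERT token
# embeddings, followed by dimension expansion and a cross-entropy-scored
# mention-pair classifier. Names and sizes are illustrative only.
import torch
import torch.nn as nn


class MultiScaleContext(nn.Module):
    """Aggregate token context at several window sizes (scales)."""

    def __init__(self, hidden: int = 768, scales=(1, 3, 5)):
        super().__init__()
        # One 1-D convolution per scale; padding preserves sequence length.
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, hidden, kernel_size=k, padding=k // 2)
            for k in scales
        )
        self.fuse = nn.Linear(hidden * len(scales), hidden)

    def forward(self, x):                       # x: (batch, seq, hidden)
        x = x.transpose(1, 2)                   # -> (batch, hidden, seq)
        multi = [torch.relu(conv(x)) for conv in self.convs]
        multi = torch.cat(multi, dim=1).transpose(1, 2)
        return self.fuse(multi)                 # (batch, seq, hidden)


class PairScorer(nn.Module):
    """Expand span-pair features to a higher dimension, then classify."""

    def __init__(self, hidden: int = 768, expanded: int = 3072):
        super().__init__()
        self.expand = nn.Sequential(
            nn.Linear(hidden * 2, expanded),    # dimension expansion
            nn.ReLU(),
        )
        self.classify = nn.Linear(expanded, 2)  # coreferent vs. not

    def forward(self, span_a, span_b):          # each: (batch, hidden)
        pair = torch.cat([span_a, span_b], dim=-1)
        return self.classify(self.expand(pair))


if __name__ == "__main__":
    enc = MultiScaleContext()
    scorer = PairScorer()
    tokens = torch.randn(2, 50, 768)            # stand-in for BERT output
    ctx = enc(tokens)
    logits = scorer(ctx[:, 0], ctx[:, 10])      # two illustrative spans
    loss = nn.CrossEntropyLoss()(logits, torch.tensor([1, 0]))
    print(ctx.shape, logits.shape, loss.item())
```

In this sketch the different convolution kernel sizes stand in for the "multi-scale" context windows, and the wider hidden layer in the pair scorer stands in for the dimension expansion the abstract credits with improving linear separability; the paper itself may realize these components differently.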

Details

Language :
English
ISSN :
1099-4300
Volume :
26
Issue :
6
Database :
Academic Search Index
Journal :
Entropy
Publication Type :
Academic Journal
Accession number :
178154109
Full Text :
https://doi.org/10.3390/e26060529