
Enhancing relation extraction using multi-task learning with SDP evidence.

Authors :
Wang, Hailin
Zhang, Dan
Liu, Guisong
Huang, Li
Qin, Ke
Source :
Information Sciences. Jun 2024, Vol. 670, Article 120610.
Publication Year :
2024

Abstract

Relation extraction (RE) is a crucial subtask of information extraction that involves recognizing the relation between an entity pair in a sentence. Previous studies have extensively employed syntactic information, notably the shortest dependency path (SDP), to collect word evidence, termed SDP evidence, which gives clues about the given entity pair and thus improves RE. Nevertheless, prevalent transformer-based techniques lack syntactic information and cannot effectively model the essential syntactic clues that support relations. This study uses multi-task learning to address these issues by incorporating an SDP token position prediction task into the RE task. To this end, we introduce SGA, an SDP evidence guiding approach that transfers the SDP evidence into two novel supervisory signal labels: an SDP tokens label and an SDP matrix label. The former guides the attention modules to assign high attention weights to SDP token positions, emphasizing relational clues, while the latter supervises SGA to predict a parameterized asymmetric product matrix among the SDP tokens for RE. Experimental outcomes demonstrate the model's enhanced ability to leverage SDP information, directing the attention modules and the predicted matrix toward SDP evidence. Consequently, our proposed approach surpasses existing publicly available optimal baselines across four RE datasets: SemEval2010-Task8, KBP37, NYT, and WebNLG.

• Research highlight 1 (Leveraging syntactic information): Multi-task learning improves RE by predicting SDP token positions and capturing valuable semantic relationships.
• Research highlight 2 (Novel supervisory labels): Introducing SDP tokens and SDP matrix labels sharpens the model's focus on critical tokens, improving syntactic knowledge and predictions.
• Research highlight 3 (Comparable performance): SGA achieves state-of-the-art micro F1-scores on the SemEval2010-Task8 and KBP37 datasets and surpasses other models on NYT and WebNLG, highlighting its efficacy in incorporating syntactic information for RE tasks.
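The abstract describes two auxiliary supervisory signals: an SDP tokens label that steers attention weights onto SDP token positions, and an SDP matrix label that supervises a parameterized asymmetric product matrix over SDP token pairs. Below is a minimal PyTorch sketch of how such auxiliary losses could be wired up; the tensor shapes, the binary cross-entropy loss choices, and all names (SDPGuidedHead, attn_weights, sdp_token_mask, sdp_matrix_label) are illustrative assumptions, not the authors' actual formulation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SDPGuidedHead(nn.Module):
        """Hypothetical auxiliary head; shapes and loss choices are assumptions."""
        def __init__(self, hidden_size: int):
            super().__init__()
            # Separate projections make the pairwise product asymmetric:
            # score(i, j) = q_i . k_j differs from score(j, i) in general.
            self.q_proj = nn.Linear(hidden_size, hidden_size)
            self.k_proj = nn.Linear(hidden_size, hidden_size)

        def forward(self, hidden_states, attn_weights, sdp_token_mask, sdp_matrix_label):
            # hidden_states:    (batch, seq_len, hidden)   encoder outputs
            # attn_weights:     (batch, seq_len)            pooled attention, already in [0, 1]
            # sdp_token_mask:   (batch, seq_len)            1 where a token lies on the SDP
            # sdp_matrix_label: (batch, seq_len, seq_len)   1 for pairs of SDP tokens
            # (1) "SDP tokens label": push attention mass onto SDP positions.
            token_loss = F.binary_cross_entropy(attn_weights, sdp_token_mask.float())

            # (2) "SDP matrix label": supervise an asymmetric pairwise score matrix.
            q = self.q_proj(hidden_states)
            k = self.k_proj(hidden_states)
            pair_scores = torch.matmul(q, k.transpose(1, 2))   # (batch, seq_len, seq_len)
            matrix_loss = F.binary_cross_entropy_with_logits(pair_scores, sdp_matrix_label.float())

            return token_loss, matrix_loss

In a multi-task setup of this kind, the two auxiliary losses would be added, with tuned weights, to the standard relation-classification loss.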

Details

Language :
English
ISSN :
0020-0255
Volume :
670
Database :
Academic Search Index
Journal :
Information Sciences
Publication Type :
Periodical
Accession number :
177026806
Full Text :
https://doi.org/10.1016/j.ins.2024.120610