
Learn to Not Link: Exploring NIL Prediction in Entity Linking

Authors :
Zhu, Fangwei
Yu, Jifan
Jin, Hailong
Li, Juanzi
Hou, Lei
Sui, Zhifang
Publication Year :
2023

Abstract

Entity linking models have achieved significant success by utilizing pretrained language models to capture semantic features. However, the NIL prediction problem, which aims to identify mentions without a corresponding entity in the knowledge base, has received insufficient attention. We categorize mentions linking to NIL into Missing Entity and Non-Entity Phrase, and propose an entity linking dataset, NEL, that focuses on the NIL prediction problem. NEL takes ambiguous entities as seeds, collects relevant mention contexts from the Wikipedia corpus, and ensures the presence of mentions linking to NIL through human annotation and entity masking. We conduct a series of experiments with widely used bi-encoder and cross-encoder entity linking models; the results show that both types of NIL mentions in the training data have a significant influence on the accuracy of NIL prediction. Our code and dataset can be accessed at https://github.com/solitaryzero/NIL_EL
Comment: ACL Findings 2023
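
To make the NIL prediction setting concrete, below is a minimal sketch of threshold-based NIL prediction in a bi-encoder entity linker: the mention context and candidate entity descriptions are encoded independently, and the mention is linked to NIL when no candidate's similarity score clears a cutoff. The encoder name, the threshold value, and the toy candidate list are illustrative assumptions, not the paper's actual configuration.

```python
# Minimal bi-encoder NIL-prediction sketch (assumptions: encoder choice,
# threshold value, and candidates are illustrative, not the paper's setup).
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"   # assumed encoder
NIL_THRESHOLD = 0.85               # assumed score cutoff for predicting NIL

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)
encoder.eval()

def embed(texts):
    """Encode texts into L2-normalized [CLS] vectors."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = encoder(**batch).last_hidden_state[:, 0]  # [CLS] representation
    return torch.nn.functional.normalize(out, dim=-1)

def link(mention_context, candidate_descriptions):
    """Return the best candidate index, or None (NIL) if no score clears the threshold."""
    m = embed([mention_context])              # (1, d)
    c = embed(candidate_descriptions)         # (k, d)
    scores = (m @ c.T).squeeze(0)             # cosine similarities
    best = int(scores.argmax())
    if scores[best] < NIL_THRESHOLD:
        return None, scores                   # predict NIL
    return best, scores

# Toy usage: an ambiguous mention of "Mercury" with two candidate descriptions.
idx, scores = link(
    "Mercury was found to have a thin exosphere by the MESSENGER probe.",
    ["Mercury is the smallest planet in the Solar System.",
     "Freddie Mercury was the lead vocalist of the rock band Queen."],
)
print("NIL" if idx is None else f"linked to candidate {idx}", scores.tolist())
```

A cross-encoder variant would instead score each (mention context, candidate description) pair jointly with a single encoder pass, which is typically more accurate but more expensive; the NIL decision can likewise be made by thresholding or by adding a dedicated NIL candidate.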

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2305.15725
Document Type :
Working Paper