Title :
URL: Universal Referential Knowledge Linking via Task-instructed Representation Compression

Authors :
Li, Zhuoqun
Lin, Hongyu
Wang, Tianshu
Cao, Boxi
Lu, Yaojie
Zhou, Weixiang
Wang, Hao
Zeng, Zhenyu
Sun, Le
Han, Xianpei
Publication Year :
2024

Abstract

Linking a claim to grounded references is a critical ability for meeting human demands for authentic and reliable information. Current studies are limited to specific tasks such as information retrieval or semantic matching, where claim-reference relationships are unique and fixed, whereas referential knowledge linking (RKL) in real-world scenarios can be far more diverse and complex. In this paper, we propose universal referential knowledge linking (URL), which aims to resolve diversified referential knowledge linking tasks with one unified model. To this end, we propose an LLM-driven task-instructed representation compression, as well as a multi-view learning approach, to effectively adapt the instruction-following and semantic-understanding abilities of LLMs to referential knowledge linking. Furthermore, we construct a new benchmark to evaluate the ability of models on referential knowledge linking tasks across different scenarios. Experiments demonstrate that universal RKL is challenging for existing approaches, while the proposed framework effectively resolves the task across various scenarios and therefore outperforms previous approaches by a large margin.
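
To make the core idea concrete, below is a minimal sketch of task-instructed representation compression as the abstract describes it: an instruction-conditioned encoder compresses a claim and a reference into single vectors, and the linking decision is scored by vector similarity. The model name, prompt template, and mean-pooling choice are illustrative assumptions, not the authors' exact setup.

```python
# Sketch of task-instructed representation compression for RKL.
# Assumptions (not from the paper): placeholder encoder model,
# "Instruction:/Text:" prompt format, mean pooling, cosine scoring.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "sentence-transformers/all-MiniLM-L6-v2"  # placeholder encoder

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()


def compress(text: str, instruction: str) -> torch.Tensor:
    """Compress text into one vector, conditioned on a task instruction."""
    prompt = f"Instruction: {instruction}\nText: {text}"
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, dim)
    # Mean-pool over non-padding tokens to get the compressed representation.
    mask = inputs["attention_mask"].unsqueeze(-1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)


def link_score(claim: str, reference: str, instruction: str) -> float:
    """Score whether `reference` grounds `claim` under the given task instruction."""
    c = compress(claim, instruction)
    r = compress(reference, instruction)
    return torch.nn.functional.cosine_similarity(c, r).item()


score = link_score(
    claim="The Eiffel Tower is located in Paris.",
    reference="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
    instruction="Decide whether the reference supports the claim.",
)
print(f"link score: {score:.3f}")
```

Because the instruction is part of the encoder input, the same model can produce different representations of the same text for different linking tasks, which is what lets one unified model cover diverse claim-reference relationships.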

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1438550327
Document Type :
Electronic Resource