
MKGL: Mastery of a Three-Word Language

Authors:
Guo, Lingbing
Bo, Zhongpu
Chen, Zhuo
Zhang, Yichi
Chen, Jiaoyan
Lan, Yarong
Sun, Mengshu
Zhang, Zhiqiang
Luo, Yangyifei
Li, Qian
Zhang, Qiang
Zhang, Wen
Chen, Huajun
Publication Year:
2024

Abstract

Large language models (LLMs) have significantly advanced performance across a spectrum of natural language processing (NLP) tasks. Yet, their application to knowledge graphs (KGs), which describe facts in the form of triplets and allow minimal hallucinations, remains an underexplored frontier. In this paper, we investigate the integration of LLMs with KGs by introducing a specialized KG Language (KGL), where a sentence precisely consists of an entity noun, a relation verb, and ends with another entity noun. Despite KGL's unfamiliar vocabulary to the LLM, we facilitate its learning through a tailored dictionary and illustrative sentences, and enhance context understanding via real-time KG context retrieval and KGL token embedding augmentation. Our results reveal that LLMs can achieve fluency in KGL, drastically reducing errors compared to conventional KG embedding methods on KG completion. Furthermore, our enhanced LLM shows exceptional competence in generating accurate three-word sentences from an initial entity and interpreting new unseen terms out of KGs.

Comment: NeurIPS 2024 (spotlight)
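The abstract's core idea, that a KG triplet reads as a three-word sentence, can be illustrated with a minimal sketch. This is not the authors' implementation; the triplets, the `to_kgl_sentence` helper, and the example entities and relations are all hypothetical, shown only to make the "entity noun, relation verb, entity noun" structure concrete.

```python
def to_kgl_sentence(head: str, relation: str, tail: str) -> str:
    """Render a knowledge-graph triplet (head, relation, tail)
    as a three-word KGL sentence: entity noun, relation verb, entity noun."""
    return f"{head} {relation} {tail}"

# Hypothetical triplets, purely for illustration.
triplets = [
    ("Paris", "capital_of", "France"),
    ("Einstein", "born_in", "Ulm"),
]

sentences = [to_kgl_sentence(*t) for t in triplets]
print(sentences)
```

Each sentence is exactly three tokens long, which is what lets the paper treat KG completion as generating (or finishing) such sentences rather than scoring embeddings.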

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2410.07526
Document Type:
Working Paper