
A Hierarchical N-Gram Framework for Zero-Shot Link Prediction

Authors:
Li, Mingchen
Chen, Junfan
Mensah, Samuel
Aletras, Nikolaos
Yang, Xiulong
Ye, Yang
Publication Year:
2022

Abstract

Due to the incompleteness of knowledge graphs (KGs), zero-shot link prediction (ZSLP), which aims to predict unobserved relations in KGs, has attracted recent interest from researchers. A common solution is to use textual features of relations (e.g., surface names or textual descriptions) as auxiliary information to bridge the gap between seen and unseen relations. Current approaches learn an embedding for each word token in the text; these methods lack robustness because they suffer from the out-of-vocabulary (OOV) problem. Meanwhile, models built on character n-grams can generate expressive representations for OOV words. In this paper, we therefore propose a Hierarchical N-Gram framework for Zero-Shot Link Prediction (HNZSLP), which considers the dependencies among character n-grams of the relation surface name. Our approach first constructs a hierarchical n-gram graph on the surface name to model the organizational structure of n-grams that leads to the surface name. A GramTransformer, based on the Transformer, is then presented to model the hierarchical n-gram graph and construct the relation embedding for ZSLP. Experimental results show that the proposed HNZSLP achieves state-of-the-art performance on two ZSLP datasets.

Comment: Published as a conference paper at EMNLP Findings 2022
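The hierarchical n-gram construction described in the abstract can be sketched roughly as follows. This is an illustrative assumption, not the paper's exact algorithm: nodes are character n-grams of the relation surface name, and each n-gram of length k > 1 is linked to the two (k-1)-grams that compose it, giving the bottom-up organizational structure the abstract refers to.

```python
def hierarchical_ngram_graph(surface_name, max_n=3):
    """Build a simple hierarchical character n-gram graph (illustrative sketch).

    Returns a set of n-gram nodes and a set of (component, composed) edges.
    """
    name = surface_name.replace(" ", "_")
    nodes = set()
    edges = set()  # (child (k-1)-gram, parent k-gram)
    for n in range(1, max_n + 1):
        for i in range(len(name) - n + 1):
            gram = name[i:i + n]
            nodes.add(gram)
            if n > 1:
                # link each (n-1)-gram component to the n-gram it forms
                edges.add((name[i:i + n - 1], gram))
                edges.add((name[i + 1:i + n], gram))
    return nodes, edges

# e.g., for the relation surface name "has_part":
nodes, edges = hierarchical_ngram_graph("has_part", max_n=2)
```

A graph like this could then be fed to a Transformer-style encoder (the paper's GramTransformer) that attends over n-gram nodes subject to these dependency edges; since every node is a character n-gram, an unseen relation name never falls outside the vocabulary.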

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2204.10293
Document Type: Working Paper