
NASE: Learning Knowledge Graph Embedding for Link Prediction via Neural Architecture Search

Authors :
Yan Zhang
Xiaoyu Kou
Bingfeng Luo
Huang Hu
Source :
CIKM
Publication Year :
2020

Abstract

Link prediction is the task of predicting missing connections between entities in a knowledge graph (KG). While many models have been proposed for the link prediction task, most are designed around a few known relation patterns observed in several well-known datasets. Due to the diverse and complex nature of real-world KGs, it is inherently difficult to design a single model that fits all datasets well. To address this issue, previous work has used Automated Machine Learning (AutoML) to search for the best model for a given dataset; however, the search space was limited to bilinear model families. In this paper, we propose a novel Neural Architecture Search (NAS) framework for the link prediction task. First, the embeddings of the input triplet are refined by the Representation Search Module. Then, the prediction score is computed by an architecture found within the Score Function Search Module. This framework entails a more general search space that covers several mainstream model families, and thus it can potentially achieve better performance. We relax the search space to be continuous so that the architecture can be optimized efficiently with gradient-based search strategies. Experimental results on several benchmark datasets demonstrate the effectiveness of our method compared with several state-of-the-art approaches.
Accepted by CIKM 2020, short paper.
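To make the continuous relaxation idea concrete, below is a minimal sketch (not the authors' implementation) of how a score-function search space can be relaxed in the DARTS style: each candidate score function gets an architecture weight, the weights are normalized with a softmax, and the mixed score is a weighted sum that can be optimized by gradient descent. The candidate functions (DistMult- and TransE-style) and all names here are illustrative assumptions; NASE's actual search space, including the Representation Search Module, is richer.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical candidate score functions for illustration only.
def distmult(h, r, t):
    # DistMult: sum of element-wise products of head, relation, tail embeddings
    return (h * r * t).sum(dim=-1)

def transe(h, r, t):
    # TransE-style: negative L2 distance of the translated head to the tail
    return -(h + r - t).norm(p=2, dim=-1)

class MixedScoreFunction(nn.Module):
    """DARTS-style continuous relaxation over candidate score functions."""
    def __init__(self, candidates):
        super().__init__()
        self.candidates = candidates
        # One architecture logit per candidate, learned jointly with the embeddings
        self.alpha = nn.Parameter(torch.zeros(len(candidates)))

    def forward(self, h, r, t):
        weights = F.softmax(self.alpha, dim=0)
        # Evaluate every candidate and mix the scores with the softmax weights;
        # after search, the highest-weighted candidate would be retained.
        scores = torch.stack([f(h, r, t) for f in self.candidates], dim=0)
        return (weights.unsqueeze(-1) * scores).sum(dim=0)

# Toy usage with random triplet embeddings
dim = 50
h = torch.randn(128, dim)   # head entity embeddings
r = torch.randn(128, dim)   # relation embeddings
t = torch.randn(128, dim)   # tail entity embeddings

mixed = MixedScoreFunction([distmult, transe])
scores = mixed(h, r, t)     # shape: (128,), one score per triplet
```

Because the mixture is differentiable in both the embeddings and the architecture weights, standard gradient-based optimizers can search the architecture without enumerating discrete model choices.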

Details

Language :
English
Database :
OpenAIRE
Journal :
CIKM
Accession number :
edsair.doi.dedup.....99f6f3d5ecf4d456d9a48c212d9d262e