
An Empirical Study of Retrieval-enhanced Graph Neural Networks

Authors :
Wang, Dingmin
Liu, Shengchao
Wang, Hanchen
Grau, Bernardo Cuenca
Song, Linfeng
Tang, Jian
Song, Le
Liu, Qi
Publication Year :
2022

Abstract

Graph Neural Networks (GNNs) are effective tools for graph representation learning. Most GNNs rely on a recursive neighborhood aggregation scheme, known as message passing, and their theoretical expressive power is therefore bounded by the first-order Weisfeiler-Lehman test (1-WL). An effective approach to this limitation is to explicitly retrieve annotated examples and use them to enhance GNN models. While retrieval-enhanced models have proven effective in many language and vision domains, it remains an open question how effective they are when applied to graph datasets. Motivated by this, we explore how retrieval can augment the information learned by graph neural networks, and we design a retrieval-enhanced scheme called GRAPHRETRIEVAL that is agnostic to the choice of the underlying GNN model. In GRAPHRETRIEVAL, for each input graph, similar graphs together with their ground-truth labels are retrieved from an existing database and serve as an additional signal for various graph property prediction tasks. We conduct comprehensive experiments on 13 datasets and observe that GRAPHRETRIEVAL achieves substantial improvements over existing GNNs. Moreover, our empirical study also shows that retrieval enhancement is a promising remedy for the long-tailed label distribution problem.
Comment: Accepted by ECAI 2023
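
The abstract sketches the core mechanism: embed the input graph, retrieve the most similar graphs and their labels from a database of annotated examples, and combine that retrieved signal with the GNN's own prediction. The Python sketch below illustrates one way such a scheme could be wired up; the wrapper class, the cosine-similarity retrieval, the top-k parameter, and the weighted fusion via alpha are illustrative assumptions, not the exact design of GRAPHRETRIEVAL.

# Minimal sketch of a retrieval-enhanced GNN wrapper (illustrative assumptions:
# the encoder/classifier interfaces, cosine-similarity retrieval, and weighted
# label fusion are not taken from the paper).
import torch
import torch.nn.functional as F


class GraphRetrievalWrapper(torch.nn.Module):
    """Wraps any graph encoder and augments its prediction with labels
    retrieved from a store of pre-embedded, annotated training graphs."""

    def __init__(self, encoder, classifier, store_embeddings, store_labels,
                 k=5, alpha=0.5):
        super().__init__()
        self.encoder = encoder            # maps a graph to a d-dim embedding
        self.classifier = classifier      # maps the embedding to class logits
        # (num_store, d) embeddings and (num_store,) integer labels of known graphs
        self.register_buffer("store_embeddings", store_embeddings)
        self.register_buffer("store_labels", store_labels)
        self.k = k                        # number of neighbours to retrieve
        self.alpha = alpha                # weight of the model's own prediction

    def forward(self, graph):
        z = self.encoder(graph)                              # (d,)
        own_probs = F.softmax(self.classifier(z), dim=-1)    # model prediction

        # Retrieve the k most similar stored graphs by cosine similarity.
        sims = F.cosine_similarity(z.unsqueeze(0), self.store_embeddings, dim=-1)
        top_sims, top_idx = sims.topk(self.k)
        retrieved_labels = self.store_labels[top_idx]

        # Turn the retrieved labels into a soft distribution weighted by similarity.
        num_classes = own_probs.shape[-1]
        weights = F.softmax(top_sims, dim=-1)
        retrieved_probs = own_probs.new_zeros(num_classes)
        retrieved_probs.index_add_(0, retrieved_labels, weights)

        # Blend the two signals (alpha is a hyper-parameter of this sketch).
        return self.alpha * own_probs + (1 - self.alpha) * retrieved_probs

In this sketch the retrieval store is just a tensor of pre-computed embeddings; a real deployment would typically use an approximate nearest-neighbour index, and the fusion step could equally be a learned module rather than a fixed weighted average.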

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2206.00362
Document Type :
Working Paper