
Node Embedding and Classification with Adaptive Structural Fingerprint.

Authors :
Zhu, Yaokang
Wang, Jun
Zhang, Jie
Zhang, Kai
Source :
Neurocomputing. Sep 2022, Vol. 502, p196-208. 13p.
Publication Year :
2022

Abstract

• Graph attention network (GAT) is a promising framework for message passing on graphs.
• Increasing the attention range to more than one-hop neighbors can negatively affect the performance of GAT, reflecting the over-smoothing risk of graph neural networks in general.
• The key idea is to contextualize each node with its "structural fingerprint" that can automatically adjust to the local graph topology and edge connections in the neighborhood of the node.
• Our approach provides a useful platform for different subspaces of node features and various spatial scales of graph structures to "cross-talk" with each other through multi-head attention, which is more flexible than existing attention mechanisms that use a fixed, minimal spatial attention scale.

Graph attention network (GAT) is a promising framework for message passing on graphs, but how to exploit rich, high-order structural information in the attention mechanism is still an open challenge. Furthermore, increasing the attention range to more than one-hop neighbors can negatively affect the performance of GAT, reflecting the over-smoothing risk of graph neural networks in general. In this paper, we propose an "adaptive structural fingerprint" model to fully exploit complex graph topology in graph attention networks. The key idea is to contextualize each node with its "structural fingerprint" that can automatically adjust to the local graph topology and edge connections in the neighborhood of the node. By doing this, structural interactions between the nodes can be evaluated more accurately and better confined to relevant neighbors, thus contributing to an improved attention mechanism and clearer cluster boundaries. Furthermore, our approach provides a useful platform for different subspaces of node features and various spatial scales of graph structures to "cross-talk" with each other through multi-head attention, which is more flexible than existing attention mechanisms that use a fixed, minimal spatial attention scale.
Encouraging results are observed on a number of benchmark data sets including citation and social networks. [ABSTRACT FROM AUTHOR]
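To make the "structural fingerprint" idea concrete, the following is a minimal sketch, not the authors' exact formulation: each node's fingerprint is taken as a distance-decayed weight vector over its multi-hop neighborhood, the structural interaction between two nodes as a weighted-Jaccard overlap of their fingerprints, and the final attention as a softmax over feature scores augmented by the structural scores. The functions `structural_fingerprint`, `structural_interaction`, and `combined_attention`, as well as the `decay` and `beta` parameters, are illustrative assumptions.

```python
import numpy as np

def structural_fingerprint(adj, node, decay=0.5, hops=2):
    """Weight vector over nodes within `hops` of `node`, decaying with distance.

    A simple stand-in for the paper's adaptive fingerprint: weight 1 on the
    node itself, decay**h on nodes first reached at hop h.
    """
    n = adj.shape[0]
    w = np.zeros(n)
    w[node] = 1.0
    frontier, visited = {node}, {node}
    for h in range(1, hops + 1):
        nxt = set()
        for u in frontier:
            for v in np.nonzero(adj[u])[0]:
                if int(v) not in visited:
                    nxt.add(int(v))
        for v in nxt:
            w[v] = decay ** h
        visited |= nxt
        frontier = nxt
    return w

def structural_interaction(wi, wj):
    """Weighted-Jaccard overlap of two fingerprints, in [0, 1]."""
    return np.minimum(wi, wj).sum() / np.maximum(wi, wj).sum()

def combined_attention(feat_scores, struct_scores, beta=1.0):
    """Softmax over feature attention scores boosted by structural scores."""
    z = feat_scores + beta * struct_scores
    z = z - z.max()                      # numerical stability
    p = np.exp(z)
    return p / p.sum()

# Tiny path graph 0-1-2-3: nearby nodes get higher structural interaction.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]])
w0 = structural_fingerprint(adj, 0)      # [1.0, 0.5, 0.25, 0.0]
w1 = structural_fingerprint(adj, 1)      # [0.5, 1.0, 0.5, 0.25]
s01 = structural_interaction(w0, w1)     # overlap of the two fingerprints
```

In a full multi-head setting, each head could use its own `hops`/`decay`, letting different spatial scales "cross-talk" with different feature subspaces as the abstract describes.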

Details

Language :
English
ISSN :
0925-2312
Volume :
502
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
158157550
Full Text :
https://doi.org/10.1016/j.neucom.2022.05.073