1. ExGAT: Context extended graph attention neural network.
- Authors
- Quan P, Zheng L, Zhang W, Xiao Y, Niu L, and Shi Y
- Subjects
- Algorithms; Humans; Computer Simulation; Neural Networks, Computer; Attention/physiology
- Abstract
As an essential concept in attention, context defines the overall scope under consideration. In attention-based GNNs, the context is the set of representation nodes in the graph embedding. Current approaches choose the target's immediate neighbors, or a subset of them, as the context, which limits the ability of attention to capture long-distance dependencies. To address this deficiency, we propose a novel attention-based GNN framework with extended contexts. Concretely, multi-hop nodes are first selected for context expansion according to their information transferability and hop count. Then, to reduce computational cost and fit the graph representation learning process, two heuristic context-refinement policies are designed around local graph structure. For graphs with high degrees, multi-hop neighbors with few connections to the target are removed to acquire accurate diffused information. For graphs with low degrees or a uniform degree distribution, low-transferability neighbors are dislodged so that graph locality is not obscured by the global information induced by the extended context. Finally, multi-head attention is applied over the refined context. Numerical comparisons with 23 baselines demonstrate the superiority of our method, and extensive model analysis shows that properly extending the context with informative multi-hop neighbors indeed improves the performance of attention-based GNNs.

Competing Interests: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper. (Copyright © 2024 Elsevier Ltd. All rights reserved.)
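The context-expansion and refinement steps described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the adjacency-dict graph representation, the hop-count-based transferability proxy (`1 / hop`), the thresholds, and all function names are assumptions.

```python
from collections import deque

def k_hop_neighbors(adj, target, k):
    """BFS up to k hops; returns {node: hop_distance}, excluding the target."""
    dist = {target: 0}
    queue = deque([target])
    while queue:
        u = queue.popleft()
        if dist[u] == k:
            continue
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    dist.pop(target)
    return dist

def refine_context(adj, target, k=2, high_degree=True,
                   min_links=2, min_transfer=0.5):
    """Expand the target's context to k hops, then refine it.

    high_degree=True  -> policy 1: keep multi-hop nodes with at least
                         min_links connections into the target's 1-hop
                         neighborhood (drops weakly connected nodes).
    high_degree=False -> policy 2: keep only multi-hop nodes whose
                         transferability proxy meets min_transfer.
    """
    hops = k_hop_neighbors(adj, target, k)
    one_hop = set(adj[target])
    context = set()
    for v, h in hops.items():
        if h == 1:                       # immediate neighbors always stay
            context.add(v)
            continue
        transfer = 1.0 / h               # assumed proxy: decays with hop count
        if high_degree:
            links = len(set(adj[v]) & one_hop)
            if links >= min_links:
                context.add(v)
        elif transfer >= min_transfer:
            context.add(v)
    return context
```

In a full model, multi-head attention would then be computed over the returned context set rather than over the raw 1-hop neighborhood; the thresholds `min_links` and `min_transfer` play the role of the degree-dependent policy choice the abstract describes.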
- Published
- 2025