RSGNN: residual structure graph neural network.
- Author
Chen, Shuang; Zhang, Changlun; Gu, Fan; Wang, Haochen
- Abstract
Compared to conventional artificial neural networks, Graph Neural Networks (GNNs) are better suited to graph-structured data. Graph topology plays an important role in learning graph representations and strongly affects GNN performance. However, existing GNNs struggle to adequately capture and represent the full graph topology. To better capture topological structure information during message passing, we propose a novel GNN architecture called the Residual Structure Graph Neural Network (RSGNN). Specifically, RSGNN constructs residual links on local subgraphs to express potential relationships between nodes, compensating for the structural information missed when only real edge connections are used. The influence of the edge structures of neighboring nodes is also taken into account. We conduct comprehensive experiments on various graph benchmark datasets to evaluate the efficacy of the proposed RSGNN model. The experimental results demonstrate that our model outperforms existing state-of-the-art methods and alleviates the over-smoothing issue.
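The abstract's core idea, carrying residual (skip) information alongside message passing to preserve structure and counter over-smoothing, can be illustrated with a generic sketch. This is not the authors' exact RSGNN construction (the paper's residual links over local subgraphs are more elaborate); it is a minimal GCN-style layer with a residual connection, using NumPy, with all function and variable names being illustrative assumptions.

```python
import numpy as np

def gnn_layer_with_residual(H, A, W):
    """One message-passing layer with a residual (skip) connection.

    H: (n, d) node feature matrix; A: (n, n) adjacency matrix; W: (d, d) weights.
    Generic illustration of residual message passing, not the exact RSGNN layer.
    """
    n = A.shape[0]
    # Add self-loops and symmetrically normalize the adjacency (GCN-style).
    A_hat = A + np.eye(n)
    deg = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    # Aggregate neighbor features, transform, and apply a ReLU nonlinearity.
    H_new = np.maximum(A_norm @ H @ W, 0.0)
    # Residual connection: carry the input features forward unchanged,
    # which helps keep node representations distinct as layers are stacked
    # (the over-smoothing issue the abstract mentions).
    return H_new + H

# Tiny example: a 3-node path graph with 2-dimensional features.
H = np.ones((3, 2))
A = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])
W = np.eye(2)
out = gnn_layer_with_residual(H, A, W)
```

Stacking several such layers keeps a direct path from the input features to every layer's output, which is the general motivation for residual designs in deep GNNs.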
- Published
- 2024