
Alleviating over-smoothing via graph sparsification based on vertex feature similarity.

Authors :
Wu, Gongce
Lin, Shukuan
Zhuang, Yilin
Qiao, Jianzhong
Source :
Applied Intelligence; Sep2023, Vol. 53 Issue 17, p20223-20238, 16p
Publication Year :
2023

Abstract

In recent years, graph neural networks (GNNs) have developed rapidly. However, GNNs are difficult to deepen because of over-smoothing, which limits their applications. Starting from the relationship between graph sparsification and over-smoothing, and to address the problems in current graph sparsification methods, we propose a novel graph sparsification method, SimSparse, based on vertex feature similarity, and theoretically prove that it can help to relieve over-smoothing. Furthermore, we also propose its derivatives SimSparse-G (SimSparse with Gumbel-Softmax) and SimSparse-GC (SimSparse-G with memory occupancy reduCtion), which can achieve end-to-end learning. Moreover, SimSparse-GC can be applied to larger graphs. Based on the three graph sparsification methods, we further propose a general sparse-convolution block, SparseConvBlock, with a sparsification layer and a graph convolutional layer, to construct deep GNNs in which over-smoothing can be better alleviated. Extensive experiments show that the GNNs constructed by SparseConvBlock perform better as the number of network layers increases (i.e., in deep GNNs). Furthermore, we verify that our graph sparsification methods help to relieve over-smoothing in deep GNNs. [ABSTRACT FROM AUTHOR]
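The abstract does not specify the exact similarity measure or pruning rule SimSparse uses, but the core idea (removing edges whose endpoint vertices have dissimilar features, so that message passing mixes fewer unrelated neighborhoods) can be illustrated with a minimal sketch. The cosine-similarity scoring, the `keep_ratio` parameter, and the function name below are illustrative assumptions, not the paper's method:

```python
import numpy as np

def sparsify_by_similarity(edges, features, keep_ratio=0.5):
    """Keep only the edges whose endpoint features are most similar.

    edges:      (E, 2) int array of (src, dst) vertex pairs
    features:   (N, D) float array of vertex feature vectors
    keep_ratio: fraction of edges to retain (illustrative knob)
    """
    src, dst = edges[:, 0], edges[:, 1]
    # Cosine similarity between the feature vectors of each edge's endpoints.
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)
    sim = np.sum(f[src] * f[dst], axis=1)
    k = max(1, int(len(edges) * keep_ratio))
    # Retain the k highest-similarity edges; discard the rest.
    keep = np.argsort(-sim)[:k]
    return edges[np.sort(keep)]

# Toy graph: vertices 0 and 1 have similar features; vertex 2 differs.
feats = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])
edges = np.array([[0, 1], [0, 2], [1, 2]])
print(sparsify_by_similarity(edges, feats, keep_ratio=1 / 3))  # keeps edge (0, 1)
```

In a SparseConvBlock-style architecture, a sparsification step like this would precede each graph convolutional layer; the Gumbel-Softmax variants described in the abstract replace the hard top-k selection with a differentiable one so the pruning can be trained end to end.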

Details

Language :
English
ISSN :
0924-669X
Volume :
53
Issue :
17
Database :
Complementary Index
Journal :
Applied Intelligence
Publication Type :
Academic Journal
Accession number :
171995003
Full Text :
https://doi.org/10.1007/s10489-023-04537-0