
OpenGSL: A Comprehensive Benchmark for Graph Structure Learning

Authors:
Zhou, Zhiyao
Zhou, Sheng
Mao, Bochao
Zhou, Xuanyi
Chen, Jiawei
Tan, Qiaoyu
Zha, Daochen
Feng, Yan
Chen, Chun
Wang, Can
Publication Year:
2023

Abstract

Graph Neural Networks (GNNs) have emerged as the de facto standard for representation learning on graphs, owing to their ability to effectively integrate graph topology and node attributes. However, the inherent suboptimal nature of node connections, resulting from the complex and contingent formation process of graphs, presents significant challenges in modeling them effectively. To tackle this issue, Graph Structure Learning (GSL), a family of data-centric learning approaches, has garnered substantial attention in recent years. The core concept behind GSL is to jointly optimize the graph structure and the corresponding GNN models. Despite the proposal of numerous GSL methods, the progress in this field remains unclear due to inconsistent experimental protocols, including variations in datasets, data processing techniques, and splitting strategies. In this paper, we introduce OpenGSL, the first comprehensive benchmark for GSL, aimed at addressing this gap. OpenGSL enables a fair comparison among state-of-the-art GSL methods by evaluating them across various popular datasets using uniform data processing and splitting strategies. Through extensive experiments, we observe that existing GSL methods do not consistently outperform vanilla GNN counterparts. We also find no significant correlation between the homophily of the learned structure and task performance, challenging the common belief. Moreover, we observe that the learned graph structure generalizes well across different GNN models, despite the high computational and memory cost of learning it. We hope that our open-sourced library will facilitate rapid and equitable evaluation and inspire further innovative research in this field. The code of the benchmark can be found at https://github.com/OpenGSL/OpenGSL.

Comment: 25 pages, 21 figures. Camera-ready version for the NeurIPS Datasets and Benchmarks Track 2023
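To make the core GSL idea mentioned in the abstract concrete, below is a minimal, illustrative sketch of jointly optimizing a learnable (dense) adjacency matrix together with the weights of a simple two-layer GCN. This is an assumption-laden toy example, not the OpenGSL API or any specific method from the benchmark; names such as LearnableGraphGCN, adj_logits, and sparsity_weight are hypothetical.

# Minimal sketch of joint structure + GNN optimization (an assumed toy setup,
# not OpenGSL's actual implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class LearnableGraphGCN(nn.Module):
    def __init__(self, in_dim, hid_dim, n_classes, init_adj):
        super().__init__()
        # Structure parameters: start from the observed adjacency and refine it
        # jointly with the GNN weights during training.
        self.adj_logits = nn.Parameter(init_adj.clone().float())
        self.w1 = nn.Linear(in_dim, hid_dim)
        self.w2 = nn.Linear(hid_dim, n_classes)

    def normalized_adj(self):
        # Symmetrize, squash edge weights to (0, 1), add self-loops, row-normalize.
        a = torch.sigmoid((self.adj_logits + self.adj_logits.t()) / 2)
        a = a + torch.eye(a.size(0), device=a.device)
        deg = a.sum(dim=1, keepdim=True).clamp(min=1e-6)
        return a / deg

    def forward(self, x):
        a = self.normalized_adj()
        h = F.relu(a @ self.w1(x))
        return a @ self.w2(h)


def train_step(model, optimizer, x, y, train_mask, sparsity_weight=1e-4):
    # One joint update: the task loss backpropagates into both the GNN weights
    # and the structure parameters; an L1 penalty keeps the learned graph sparse.
    model.train()
    optimizer.zero_grad()
    logits = model(x)
    loss = F.cross_entropy(logits[train_mask], y[train_mask])
    loss = loss + sparsity_weight * torch.sigmoid(model.adj_logits).sum()
    loss.backward()
    optimizer.step()
    return loss.item()

In this sketch the observed adjacency only initializes the structure; the dense parameterization is what drives the quadratic memory cost the abstract alludes to, and the learned adjacency could afterwards be fed to a different GNN to probe the cross-model generalization finding.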

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2306.10280
Document Type: Working Paper