
GraphAlign: Pretraining One Graph Neural Network on Multiple Graphs via Feature Alignment

Authors:
Hou, Zhenyu
Li, Haozhan
Cen, Yukuo
Tang, Jie
Dong, Yuxiao
Publication Year:
2024

Abstract

Graph self-supervised learning (SSL) holds considerable promise for mining and learning with graph-structured data. Yet, a significant challenge in graph SSL lies in the feature discrepancy among graphs across different domains. In this work, we aim to pretrain one graph neural network (GNN) on a varied collection of graphs endowed with rich node features and subsequently apply the pretrained GNN to unseen graphs. We present GraphAlign, a general method that can be seamlessly integrated into existing graph SSL frameworks. To align feature distributions across disparate graphs, GraphAlign introduces alignment strategies for feature encoding and normalization, alongside a mixture-of-feature-expert module. Extensive experiments show that GraphAlign empowers existing graph SSL frameworks to pretrain a unified and powerful GNN across multiple graphs, showcasing performance superiority on both in-domain and out-of-domain graphs.
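The abstract does not give implementation details; the following is a minimal PyTorch-style sketch, under the assumption that feature alignment amounts to per-graph encoders mapping heterogeneous node features into a shared space, followed by normalization and a soft mixture of feature experts. All names (FeatureAligner, hidden_dim, num_experts) are illustrative and not taken from the paper.

    # Minimal sketch (not the authors' code): align node features from graphs
    # with different input dimensions into one shared space via a per-graph
    # encoder, normalization, and a mixture of feature experts.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FeatureAligner(nn.Module):
        def __init__(self, in_dims, hidden_dim=256, num_experts=4):
            super().__init__()
            # One encoder per source graph, mapping raw features to a common width.
            self.encoders = nn.ModuleDict(
                {name: nn.Linear(d, hidden_dim) for name, d in in_dims.items()}
            )
            self.norm = nn.LayerNorm(hidden_dim)            # align feature scales
            self.gate = nn.Linear(hidden_dim, num_experts)  # soft routing over experts
            self.experts = nn.ModuleList(
                [nn.Linear(hidden_dim, hidden_dim) for _ in range(num_experts)]
            )

        def forward(self, x, graph_name):
            h = self.norm(self.encoders[graph_name](x))     # encode + normalize
            weights = F.softmax(self.gate(h), dim=-1)       # [N, num_experts]
            expert_out = torch.stack([e(h) for e in self.experts], dim=1)  # [N, E, D]
            # Weighted mixture of expert projections per node.
            return (weights.unsqueeze(-1) * expert_out).sum(dim=1)

    # Example: two graphs with different raw feature sizes mapped to one space.
    aligner = FeatureAligner({"cora": 1433, "arxiv": 128})
    z = aligner(torch.randn(10, 1433), "cora")              # -> [10, 256]

The aligned representations could then feed any standard GNN backbone during SSL pretraining; this sketch only illustrates the feature-alignment idea described in the abstract.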

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2406.02953
Document Type:
Working Paper