
Train Your Own GNN Teacher: Graph-Aware Distillation on Textual Graphs

Authors:
Mavromatis, Costas
Ioannidis, Vassilis N.
Wang, Shen
Zheng, Da
Adeshina, Soji
Ma, Jun
Zhao, Han
Faloutsos, Christos
Karypis, George
Publication Year:
2023

Abstract

How can we learn effective node representations on textual graphs? Graph Neural Networks (GNNs) that use Language Models (LMs) to encode the textual information of graphs achieve state-of-the-art performance on many node classification tasks. Yet combining GNNs with LMs has not been widely explored in practical deployments due to scalability issues. In this work, we tackle this challenge by developing a Graph-Aware Distillation framework (GRAD) that encodes graph structure into an LM for graph-free, fast inference. Unlike conventional knowledge distillation, GRAD jointly optimizes a GNN teacher and a graph-free student over the graph's nodes via a shared LM. This encourages the graph-free student to exploit the graph information encoded by the GNN teacher while, at the same time, enabling the GNN teacher to better leverage textual information from unlabeled nodes. As a result, the teacher and student models learn from each other and improve their overall performance. Experiments on eight node classification benchmarks, in both transductive and inductive settings, showcase GRAD's superiority over existing distillation approaches for textual graphs.
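
As a rough illustration of the joint objective described in the abstract, the sketch below shows one plausible training step in PyTorch. It is an assumption-based reconstruction, not the authors' implementation: the components `lm` (shared text encoder), `gnn` (GNN teacher head), `mlp` (graph-free student head), and the weighting `alpha` are hypothetical names, and GRAD's exact loss combination may differ.

```python
import torch.nn.functional as F

def grad_step(lm, gnn, mlp, texts, adj, labels, optimizer, alpha=0.5):
    """One joint update over a batch of nodes. `lm`, `gnn`, and `mlp`
    are assumed callables (shared text encoder, GNN teacher head,
    graph-free student head); names and `alpha` are illustrative,
    not the paper's API."""
    h = lm(texts)                 # shared LM embeddings of node texts
    teacher_logits = gnn(h, adj)  # teacher sees graph structure
    student_logits = mlp(h)       # student is graph-free (text only)

    # Supervised losses plus a distillation term that pulls the student
    # toward the teacher's soft predictions. Because both heads share
    # the LM, teacher gradients also refine the text representations
    # the student consumes.
    loss_teacher = F.cross_entropy(teacher_logits, labels)
    loss_student = F.cross_entropy(student_logits, labels)
    loss_distill = F.kl_div(
        F.log_softmax(student_logits, dim=-1),
        F.softmax(teacher_logits, dim=-1).detach(),
        reduction="batchmean",
    )
    loss = loss_teacher + loss_student + alpha * loss_distill

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Under this reading, inference uses only the shared LM and the student head, so predictions need no graph neighborhood, which is what makes the distilled student fast and graph-free as the abstract claims.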

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2304.10668
Document Type:
Working Paper