
All in One: Multi-Task Prompting for Graph Neural Networks (Extended Abstract)

Authors:
Sun, Xiangguo
Cheng, Hong
Li, Jia
Liu, Bo
Guan, Jihong
Publication Year:
2024

Abstract

This paper is an extended abstract of our original work published at KDD 2023, where it won the best research paper award (Xiangguo Sun, Hong Cheng, Jia Li, Bo Liu, and Jihong Guan. All in One: Multi-Task Prompting for Graph Neural Networks. KDD '23). The paper introduces a novel approach to bridging the gap between pre-trained graph models and the diverse tasks they are applied to, inspired by the success of prompt learning in NLP. Recognizing that misalignment between pre-trained models and varied graph tasks (node-level, edge-level, and graph-level) can lead to negative transfer and poor performance, we propose a multi-task prompting method for graphs. This method unifies graph and language prompt formats, enabling NLP prompting strategies to be adapted for graph tasks. By analyzing the task space of graph applications, we reformulate downstream problems as graph-level tasks and apply meta-learning to improve prompt initialization across multiple tasks. Experiments demonstrate our method's effectiveness across different graph tasks. Beyond the original work, this extended abstract further discusses graph prompting from a broader perspective and surveys some of the latest work in this area.

Comment: Submitted to the IJCAI 2024 Sister Conferences Track. The original paper can be seen at arXiv:2307.01504.
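To make the prompting idea in the abstract concrete, the sketch below shows one minimal way a learnable graph prompt could work: a small set of trainable token vectors is softly attached to every node's features, so a frozen pre-trained GNN consumes a "prompted" graph while only the prompt parameters are tuned downstream. This is a hypothetical PyTorch illustration, not the authors' exact formulation; the class name `GraphPrompt`, the token count, and the dot-product attachment rule are assumptions made for this example.

```python
import torch
import torch.nn as nn

class GraphPrompt(nn.Module):
    """Minimal sketch of a learnable graph prompt (illustrative only).

    A handful of prompt tokens are blended into each node's feature
    vector; the pre-trained GNN itself stays frozen, and only these
    tokens are optimized for the downstream task.
    """

    def __init__(self, feat_dim: int, num_tokens: int = 10):
        super().__init__()
        # Learnable prompt tokens: the only downstream-trained parameters.
        self.tokens = nn.Parameter(torch.randn(num_tokens, feat_dim) * 0.01)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [num_nodes, feat_dim] node features of the input graph.
        # Dot-product similarity decides how strongly each token
        # attaches to each node (an assumed "inserting pattern").
        weights = torch.softmax(x @ self.tokens.t(), dim=-1)  # [N, num_tokens]
        return x + weights @ self.tokens  # prompted node features

# Usage: transform node features before the frozen pre-trained GNN.
prompt = GraphPrompt(feat_dim=64)
x = torch.randn(100, 64)        # toy node features for 100 nodes
x_prompted = prompt(x)          # feed this to the frozen GNN instead of x
```

Because only `prompt.tokens` receives gradients, the same frozen backbone can serve multiple downstream tasks, each with its own lightweight prompt; the meta-learning step described in the abstract would then be one way to find a good shared initialization for those prompts.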

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2403.07040
Document Type:
Working Paper