
Efficient and effective training of language and graph neural network models

Authors :
Ioannidis, Vassilis N.
Song, Xiang
Zheng, Da
Zhang, Houyu
Ma, Jun
Xu, Yi
Zeng, Belinda
Chilimbi, Trishul
Karypis, George
Publication Year :
2022

Abstract

Can we combine heterogeneous graph structure with text to learn high-quality semantic and behavioural representations? Graph neural networks (GNNs) encode numerical node attributes and graph structure to achieve impressive performance in a variety of supervised learning tasks. Current GNN approaches are challenged by textual features, which typically need to be encoded into a numerical vector before being provided to the GNN, a step that may incur information loss. In this paper, we put forth an efficient and effective framework termed language model GNN (LM-GNN) to jointly train large-scale language models and graph neural networks. The effectiveness of our framework is achieved by applying stage-wise fine-tuning of the BERT model, first with heterogeneous graph information and then with a GNN model. Several system and design optimizations are proposed to enable scalable and efficient training. LM-GNN accommodates node and edge classification as well as link prediction tasks. We evaluate the LM-GNN framework on different datasets and showcase the effectiveness of the proposed approach. LM-GNN provides competitive results in an Amazon query-purchase-product application.
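To make the architecture described above concrete, the sketch below shows one way a language model and a GNN might be composed end to end: BERT encodes each node's text into an embedding, and message-passing layers then refine those embeddings over the graph. This is a minimal illustration assuming PyTorch and Hugging Face transformers; the class and layer names are hypothetical and do not reflect the authors' actual implementation, and the paper's stage-wise recipe (fine-tuning BERT with graph information first, then training jointly with the GNN) would wrap around a model of this shape.

    # Minimal LM-GNN sketch (illustrative only, not the authors' code).
    import torch
    import torch.nn as nn
    from transformers import AutoModel

    class MeanAggLayer(nn.Module):
        """One mean-aggregation message-passing layer (GraphSAGE-style)."""
        def __init__(self, dim):
            super().__init__()
            self.linear = nn.Linear(2 * dim, dim)

        def forward(self, h, adj):
            # h: (N, dim) node features; adj: row-normalized (N, N) adjacency
            neigh = adj @ h  # mean of each node's neighbor features
            return torch.relu(self.linear(torch.cat([h, neigh], dim=-1)))

    class LMGNN(nn.Module):
        """BERT encodes each node's text; GNN layers refine over the graph."""
        def __init__(self, lm_name="bert-base-uncased", num_layers=2):
            super().__init__()
            self.lm = AutoModel.from_pretrained(lm_name)
            dim = self.lm.config.hidden_size
            self.layers = nn.ModuleList(
                MeanAggLayer(dim) for _ in range(num_layers)
            )

        def forward(self, input_ids, attention_mask, adj):
            # One tokenized text sequence per node; use the [CLS] embedding.
            out = self.lm(input_ids=input_ids, attention_mask=attention_mask)
            h = out.last_hidden_state[:, 0]
            for layer in self.layers:
                h = layer(h, adj)
            return h  # node embeddings for classification or link prediction

The returned node embeddings could feed a task head for node/edge classification, or be compared pairwise (e.g., dot product) for link prediction such as the query-purchase-product application mentioned in the abstract.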

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1333780081
Document Type :
Electronic Resource