
A two-phase knowledge distillation model for graph convolutional network-based recommendation.

Authors :
Zhenhua Huang
Zuorui Lin
Zheng Gong
Yunwen Chen
Yong Tang
Source :
International Journal of Intelligent Systems; Sep 2022, Vol. 37 Issue 9, p5902-5923, 22p
Publication Year :
2022

Abstract

Graph convolutional network (GCN)-based recommendation has recently attracted significant attention in the recommender system community. Although current studies propose various GCNs to improve recommendation performance, existing methods suffer from two main limitations. First, user-item interaction data is generally sparse in practice, which limits these methods' ability to learn effective user and item feature representations. Second, they usually rely on a dot-product operation to model and compute user preferences on items, leading to inaccurate user preference learning. To address these limitations, this study adopts a design idea that differs sharply from existing works. Specifically, we introduce the knowledge distillation concept into GCN-based recommendation and propose a two-phase knowledge distillation model (TKDM) to improve recommendation performance. In Phase I, a self-distillation method on a graph auto-encoder learns the user and item feature representations. This auto-encoder employs a simple two-layer GCN as an encoder and a fully connected layer as a decoder. On this basis, in Phase II, a mutual-distillation method on a fully connected layer is introduced to learn user preferences on items with triple-based Bayesian personalized ranking. Extensive experiments on three real-world data sets demonstrate that TKDM outperforms classic and state-of-the-art methods on GCN-based recommendation problems. [ABSTRACT FROM AUTHOR]
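The abstract describes the architecture concretely enough to sketch. The PyTorch sketch below illustrates the two-phase structure it outlines: a two-layer GCN encoder with a fully connected decoder trained via self-distillation (Phase I), and a BPR objective paired with mutual distillation between predictors (Phase II). Everything here, including the class names, the normalized adjacency `norm_adj`, the temperature, and the exact form of the distillation losses, is an illustrative assumption based only on the abstract, not the paper's actual implementation.

```python
# A minimal sketch of the two-phase idea described in the abstract.
# All names, hyperparameters, and loss formulations are assumptions;
# the paper's actual design may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNAutoEncoder(nn.Module):
    """Phase I: two-layer GCN encoder + fully connected decoder."""

    def __init__(self, n_nodes: int, dim: int):
        super().__init__()
        self.embed = nn.Embedding(n_nodes, dim)  # initial user/item features
        self.decoder = nn.Linear(dim, dim)       # FC decoder

    def encode(self, norm_adj: torch.Tensor) -> torch.Tensor:
        h0 = self.embed.weight
        h1 = torch.sparse.mm(norm_adj, h0)  # GCN layer 1: A_hat @ H
        h2 = torch.sparse.mm(norm_adj, h1)  # GCN layer 2
        return h2

    def forward(self, norm_adj: torch.Tensor):
        z = self.encode(norm_adj)
        return z, self.decoder(z)


def self_distill_loss(student: torch.Tensor, teacher: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """Phase I self-distillation (assumed KL form): push the decoder
    output (student) toward a softened, detached encoder output (teacher)."""
    t = F.softmax(teacher.detach() / temperature, dim=-1)
    s = F.log_softmax(student / temperature, dim=-1)
    return F.kl_div(s, t, reduction="batchmean") * temperature ** 2


def bpr_loss(pos_scores: torch.Tensor, neg_scores: torch.Tensor) -> torch.Tensor:
    """Triple-based Bayesian personalized ranking over (user, pos, neg)."""
    return -F.logsigmoid(pos_scores - neg_scores).mean()


def mutual_distill_loss(scores_a: torch.Tensor, scores_b: torch.Tensor) -> torch.Tensor:
    """Phase II mutual distillation (assumed MSE form): two predictors
    teach each other by matching the other's detached scores."""
    return (F.mse_loss(scores_a, scores_b.detach())
            + F.mse_loss(scores_b, scores_a.detach()))
```

A full training loop would combine `bpr_loss` with the two distillation terms under tuned weights; the abstract does not specify those weights or the negative-sampling scheme, so both are left open here.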

Details

Language :
English
ISSN :
0884-8173
Volume :
37
Issue :
9
Database :
Complementary Index
Journal :
International Journal of Intelligent Systems
Publication Type :
Academic Journal
Accession number :
158876433
Full Text :
https://doi.org/10.1002/int.22819