
Deep multi-graph neural networks with attention fusion for recommendation.

Authors :
Song, Yuzhi
Ye, Hailiang
Li, Ming
Cao, Feilong
Source :
Expert Systems with Applications. Apr 2022, Vol. 191.
Publication Year :
2022

Abstract

Graph neural networks (GNNs), with their promising potential to learn effective graph representations, have been widely used for recommender systems, in which the given graph data contain abundant users, items, and their historical interaction information. How to obtain preferable latent representations for both users and items is a key issue for GNN-based recommendation. This paper develops a novel deep GNN model with multi-graph attention fusion, MAF-GNN. The framework constructs two feature graph attention modules and a multi-scale latent features module to generate better user and item latent features from the input information. Specifically, a dual-branch residual graph attention (DBRGA) module is presented to extract similar features from neighbors in the user and item graphs effectively and easily. Multi-scale latent matrices are then captured by applying embedded non-linear transformations, which reduces the cost of dimension selection. Furthermore, a hybrid fusion graph attention (HFGA) module is designed to obtain valuable collaborative information from the user-item interaction graph, aiming to further refine the latent embeddings of users and items. Finally, the whole MAF-GNN framework is optimized by a geometric factorized regularization loss. Extensive experimental results on both synthetic and real-world datasets illustrate that MAF-GNN achieves better recommendation performance than some existing approaches, with a certain level of interpretability.

• A novel deep GNN model with a multi-graph attention fusion mechanism is proposed.
• A dual-branch residual graph attention module is developed.
• A hybrid fusion graph attention module is designed to obtain valuable collaborative information.
• Multi-scale latent matrices are designed to significantly reduce the time cost.

[ABSTRACT FROM AUTHOR]
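The abstract does not specify the internals of the DBRGA or HFGA modules, but both build on graph attention, where each node aggregates neighbor features weighted by learned attention scores. The following is a minimal, hypothetical sketch of one GAT-style attention aggregation step in NumPy; the function name, weight shapes, and toy graph are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def graph_attention_layer(X, A, W, a, leaky=0.2):
    """One GAT-style graph attention step (illustrative sketch, not MAF-GNN itself).

    X: (n, d_in) node features; A: (n, n) adjacency with self-loops (1 = edge);
    W: (d_in, d_out) linear projection; a: (2*d_out,) attention parameter vector.
    """
    H = X @ W                                    # project node features
    d = H.shape[1]
    # attention logits e_ij = LeakyReLU(a^T [h_i || h_j]), computed via broadcasting
    src = H @ a[:d]                              # contribution of h_i
    dst = H @ a[d:]                              # contribution of h_j
    e = src[:, None] + dst[None, :]
    e = np.where(e > 0, e, leaky * e)            # LeakyReLU
    e = np.where(A > 0, e, -1e9)                 # mask out non-neighbors
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)  # softmax over neighbors
    return alpha @ H                             # attention-weighted aggregation

# toy 3-node graph with self-loops
rng = np.random.default_rng(0)
A = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]], dtype=float)
X = rng.standard_normal((3, 4))
W = rng.standard_normal((4, 2))
a = rng.standard_normal(4)
out = graph_attention_layer(X, A, W, a)
print(out.shape)  # (3, 2): one d_out-dimensional embedding per node
```

In a recommendation setting, separate layers of this kind could run on the user graph, the item graph, and the user-item interaction graph, with their outputs fused, which is broadly the role the abstract assigns to the DBRGA and HFGA modules.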

Details

Language :
English
ISSN :
09574174
Volume :
191
Database :
Academic Search Index
Journal :
Expert Systems with Applications
Publication Type :
Academic Journal
Accession number :
154661779
Full Text :
https://doi.org/10.1016/j.eswa.2021.116240