
A Rating Prediction Algorithm Based on Self-Attention Fusion of Local and Global Features

Authors :
Yi Lei
Ji Shujuan
Source :
Application Research of Computers / Jisuanji Yingyong Yanjiu. May 2022, Vol. 39, Issue 5, p1337-1342. 6p.
Publication Year :
2022

Abstract

To fully mine node features in a heterogeneous information network and better integrate them, this paper proposed a rating prediction model, AMFL&GRec. First, AMFL&GRec used the LeaderRank algorithm to extract the target node's global sequence and a meta-path-based heterogeneous information network embedding model to extract the node's local sequence, then used the skip-gram model to learn the node's global and local features from these sequences. It then applied the self-attention mechanism to learn the target node's preference between its local and global features, yielding the node's feature representation under a single meta-path. Second, it used the self-attention mechanism to fuse the representations of the same node under different meta-paths into a final feature representation. Finally, it used a multi-layer perceptron to perform the rating prediction task. Extensive experiments on two real-world datasets verify that AMFL&GRec captures not only the micro (local) structure of densely connected nodes but also the node's global structure in the network, thereby obtaining overall (local + global) node characteristics. The experimental results also show that AMFL&GRec outperforms the baselines in rating prediction, demonstrating that, in a heterogeneous information network, using the self-attention mechanism to model node preferences for local and global features and for meta-paths can improve rating prediction accuracy. [ABSTRACT FROM AUTHOR]
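The global-sequence step relies on the LeaderRank algorithm, which ranks nodes in a directed network by adding a "ground" node linked bidirectionally to every node and letting scores flow along out-links with no damping factor. A minimal NumPy sketch of LeaderRank as commonly defined follows; the function name and the dense adjacency-matrix input are illustrative choices, and the paper's specific use of the ranking to extract node sequences is not reproduced here.

```python
import numpy as np

def leaderrank(adj, tol=1e-10, max_iter=1000):
    """LeaderRank scores for a directed graph given as an n-by-n
    adjacency matrix (adj[i, j] = 1 if there is an edge i -> j)."""
    n = adj.shape[0]
    # Augment the graph with a ground node g (index n) connected
    # bidirectionally to all real nodes; this guarantees every node
    # has at least one out-link and the chain is strongly connected.
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = adj
    A[:n, n] = 1.0   # every node -> ground
    A[n, :n] = 1.0   # ground -> every node
    P = A / A.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
    s = np.ones(n + 1)
    s[n] = 0.0       # real nodes start with score 1, ground with 0
    for _ in range(max_iter):
        s_new = P.T @ s                    # scores flow along out-links
        if np.abs(s_new - s).max() < tol:
            s = s_new
            break
        s = s_new
    # At steady state, redistribute the ground node's score evenly.
    return s[:n] + s[n] / n
```

Because the transition matrix is row-stochastic, the total score (equal to the number of real nodes) is conserved throughout the iteration, and nodes with more in-links accumulate higher scores.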

Details

Language :
Chinese
ISSN :
1001-3695
Volume :
39
Issue :
5
Database :
Academic Search Index
Journal :
Application Research of Computers / Jisuanji Yingyong Yanjiu
Publication Type :
Academic Journal
Accession number :
156813568
Full Text :
https://doi.org/10.19734/j.issn.1001-3695.2021.10.0446