1. MARec: A multi-attention aware paper recommendation method.
- Author
- Wang, Jie; Zhou, Jingya; Wu, Zhen; Sun, Xigang
- Subjects
- *MULTICASTING (Computer networks), *INFORMATION networks, *RECOMMENDER systems, *FEATURE extraction
- Abstract
Paper recommender systems provide great convenience for people seeking knowledge in areas of interest. Existing recommendation works mainly rely on learning entities' feature representations (network embeddings) from Heterogeneous Information Networks (HINs) through meta-paths, and most suffer from the difficulty of designing and selecting meta-paths in an HIN. Moreover, recent changes in a paper's audience and in users' reading preferences are often masked by long-term trends. To this end, we introduce a multi-attention aware recommendation method, named MARec. Specifically, to efficiently design and select meta-paths, we propose an auxiliary node-driven HIN construction method and combine it with a graph attention network (GAT) to learn entities' feature representations. To endow the recommendation results with timeliness, we design an attention-aware bi-directional LSTM (Bi-LSTM) and a compensation mechanism that incorporates recent trend changes. Furthermore, to exploit the varying importance of different papers/users, we employ an attention mechanism that adaptively fuses the historical interaction data of those paper/user entities. Extensive experiments on three real-world datasets show that MARec outperforms current representative baselines by 11.49%, 11.40%, and 13.37% on average across three main metrics.
• A general heterogeneous information network that simplifies meta-path design.
• Embedding with an attention mechanism avoids painstakingly selecting meta-paths.
• Feature extraction at multiple time scales greatly improves recommendation results.
• MARec demonstrates superior performance on real-world datasets from different domains. [ABSTRACT FROM AUTHOR]
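The abstract does not specify MARec's exact architecture, but the attention-based pooling it repeatedly invokes (weighting Bi-LSTM outputs or entity histories by learned importance before fusing them) follows a standard pattern. The sketch below is a hypothetical, minimal illustration of that general idea in pure Python: each hidden state is scored against a query vector, the scores are softmax-normalized into attention weights, and the states are fused into one context vector. All names (`attention_pool`, the toy vectors) are illustrative, not from the paper.

```python
import math

def softmax(scores):
    # numerically stable softmax over a list of raw scores
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, query):
    # score each hidden state by its dot product with the query vector
    scores = [sum(h_i * q_i for h_i, q_i in zip(h, query)) for h in hidden_states]
    weights = softmax(scores)          # importance of each time step / entity
    dim = len(hidden_states[0])
    # fuse: attention-weighted sum of the hidden states -> one context vector
    return [sum(w * h[d] for w, h in zip(weights, hidden_states))
            for d in range(dim)]

# toy example: three 2-d "Bi-LSTM outputs"; the third aligns best with the query
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
ctx = attention_pool(H, query=[1.0, 1.0])
```

In MARec this kind of weighting would let recent, relevant interactions dominate the fused representation instead of being averaged away by long-term history, which matches the timeliness goal stated in the abstract.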
- Published
- 2023