
Disentangling Long and Short-Term Interests for Recommendation

Authors: Zheng, Yu; Gao, Chen; Chang, Jianxin; Niu, Yanan; Song, Yang; Jin, Depeng; Li, Yong
Publication Year: 2022

Abstract

Modeling users' long-term and short-term interests is crucial for accurate recommendation. However, since there are no manually annotated labels for user interests, existing approaches typically entangle these two aspects, which may hurt both recommendation accuracy and interpretability. In this paper, to address this problem, we propose a Contrastive learning framework to disentangle Long and Short-term interests for Recommendation (CLSR) with self-supervision. Specifically, we first propose two separate encoders to independently capture user interests at different time scales. We then extract long-term and short-term interest proxies from the interaction sequences, which serve as pseudo labels for user interests. Pairwise contrastive tasks are then designed to supervise the similarity between interest representations and their corresponding interest proxies. Finally, since the importance of long-term and short-term interests changes dynamically, we propose to aggregate them adaptively through an attention-based network for prediction. We conduct experiments on two large-scale real-world datasets for e-commerce and short-video recommendation. Empirical results show that CLSR consistently outperforms all state-of-the-art models with significant improvements: GAUC is improved by over 0.01, and NDCG is improved by over 4%. Further counterfactual evaluations demonstrate that CLSR achieves stronger disentanglement of long- and short-term interests. The code and data are available at https://github.com/tsinghua-fib-lab/CLSR.

Comment: Accepted by WWW'22
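
The sketch below illustrates the pipeline the abstract describes (two encoders, proxy extraction, pairwise contrastive supervision, attention-based fusion). It is a minimal illustration assembled from the abstract alone, not the authors' implementation (see the GitHub repository above for that); the encoder choice (GRUs), the proxy definitions (mean of the full history vs. mean of recent items), the `recent_k` parameter, and the BPR-style contrastive loss are all assumptions made for illustration.

```python
# Minimal sketch of the CLSR idea as described in the abstract.
# NOT the authors' implementation; all architectural details are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CLSRSketch(nn.Module):
    def __init__(self, num_items: int, dim: int = 64, recent_k: int = 5):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, dim)
        self.recent_k = recent_k
        # Two separate encoders, one per time scale (assumed to be GRUs here).
        self.long_encoder = nn.GRU(dim, dim, batch_first=True)
        self.short_encoder = nn.GRU(dim, dim, batch_first=True)
        # Attention-style gate for adaptive fusion of the two interests.
        self.fusion = nn.Sequential(nn.Linear(2 * dim, dim), nn.Tanh(),
                                    nn.Linear(dim, 1))

    def forward(self, seq):                       # seq: (batch, seq_len) item ids
        x = self.item_emb(seq)                    # (batch, seq_len, dim)
        _, h_long = self.long_encoder(x)          # long-term interest
        _, h_short = self.short_encoder(x[:, -self.recent_k:])  # short-term interest
        u_l, u_s = h_long.squeeze(0), h_short.squeeze(0)

        # Proxies: pseudo labels extracted from the interaction sequence itself
        # (assumed: mean of the whole history vs. mean of the recent items).
        p_l = x.mean(dim=1)
        p_s = x[:, -self.recent_k:].mean(dim=1)

        # Pairwise contrastive tasks: each representation should be closer to
        # its own proxy than to the other one (BPR-style pairwise loss assumed).
        def contrast(anchor, pos, neg):
            return -F.logsigmoid((anchor * pos).sum(-1)
                                 - (anchor * neg).sum(-1)).mean()

        ssl_loss = (contrast(u_l, p_l, p_s) + contrast(u_s, p_s, p_l)
                    + contrast(p_l, u_l, u_s) + contrast(p_s, u_s, u_l))

        # Adaptive aggregation: a learned weight decides how much each
        # interest contributes to the final user representation.
        alpha = torch.sigmoid(self.fusion(torch.cat([u_l, u_s], dim=-1)))
        user_repr = alpha * u_l + (1 - alpha) * u_s
        return user_repr, ssl_loss
```

In training, `ssl_loss` would be added to the main recommendation loss so that the two encoders are pushed toward their respective proxies, which is the self-supervised disentanglement signal the abstract refers to.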

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2202.13090
Document Type: Working Paper
Full Text: https://doi.org/10.1145/3485447.3512098