
Leave No One Behind: Online Self-Supervised Self-Distillation for Sequential Recommendation

Authors:
Wei, Shaowei
Wu, Zhengwei
Li, Xin
Wu, Qintong
Zhang, Zhiqiang
Zhou, Jun
Gu, Lihong
Gu, Jinjie
Publication Year: 2024

Abstract

Sequential recommendation methods play a pivotal role in modern recommendation systems. A key challenge lies in accurately modeling user preferences under data sparsity. To tackle this challenge, recent methods leverage contrastive learning (CL) to derive self-supervision signals by maximizing the mutual information between two augmented views of the original user behavior sequence. Despite their effectiveness, CL-based methods struggle to fully exploit self-supervision signals for users with limited behavior data, since users with extensive behaviors naturally offer more information. To address this problem, we introduce a novel learning paradigm, named Online Self-Supervised Self-distillation for Sequential Recommendation ($S^4$Rec), which bridges self-supervised learning and self-distillation. Specifically, we employ online clustering to group users by their distinct latent intents, and apply an adversarial learning strategy to ensure that the clustering procedure is not biased by the length of users' behavior sequences. Subsequently, we employ self-distillation to transfer knowledge from users with extensive behaviors (teachers) to users with limited behaviors (students). Experiments conducted on four real-world datasets validate the effectiveness of the proposed method.
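
The abstract describes three coupled objectives: a contrastive loss over two augmented views of a behavior sequence, online clustering of users by latent intent with an adversarial discriminator that strips behavior-length information from the clustered representation, and a self-distillation loss that pulls short-sequence (student) users toward long-sequence (teacher) users in the same cluster. The PyTorch sketch below illustrates one plausible shape of these losses; it is not the authors' implementation, and the class name `S4RecLosses`, the soft-assignment clustering, the binary long/short discriminator, and the MSE-style distillation target are all assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; flips the gradient sign in backward,
    so the encoder learns to *remove* what the discriminator predicts."""

    @staticmethod
    def forward(ctx, x, lamb):
        ctx.lamb = lamb
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lamb * grad_out, None


def info_nce(z1, z2, tau=0.2):
    """Standard contrastive (InfoNCE) loss between two augmented views;
    matching rows of z1 and z2 are positives, all other rows negatives."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau                       # (B, B)
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)


class S4RecLosses(nn.Module):
    """Hypothetical auxiliary objectives in the spirit of the abstract:
    online clustering over latent intents, an adversarial length
    discriminator on the clustered features, and self-distillation from
    long-sequence teachers to short-sequence students per cluster."""

    def __init__(self, dim, n_clusters=32, tau=0.2):
        super().__init__()
        self.centroids = nn.Parameter(torch.randn(n_clusters, dim))
        self.len_head = nn.Linear(dim, 2)  # assumed long/short discriminator
        self.tau = tau

    def forward(self, z1, z2, is_long, lamb=1.0):
        # (1) Contrastive self-supervision between the two views.
        loss_cl = info_nce(z1, z2, self.tau)

        z = F.normalize(z1, dim=-1)
        c = F.normalize(self.centroids, dim=-1)
        assign = F.softmax(z @ c.t() / self.tau, dim=-1)  # soft assignment (B, K)

        # (2) Adversarial debiasing: the head tries to recover the
        # behavior-length group; gradient reversal pushes the encoder to
        # make cluster features uninformative about sequence length.
        len_logits = self.len_head(GradReverse.apply(z, lamb))
        loss_adv = F.cross_entropy(len_logits, is_long.long())

        # (3) Self-distillation: per cluster, the teacher target is the
        # assignment-weighted mean of long-sequence user embeddings;
        # short-sequence users are pulled toward their cluster's teacher.
        w = assign * is_long.float().unsqueeze(1)               # (B, K)
        teacher = (w.t() @ z) / (w.sum(0).unsqueeze(1) + 1e-8)  # (K, dim)
        target = (assign @ teacher).detach()                    # (B, dim)
        short = (~is_long.bool()).float()
        loss_kd = ((z - target).pow(2).sum(-1) * short).sum() / (short.sum() + 1e-8)

        return loss_cl + loss_adv + loss_kd
```

In a full pipeline these losses would presumably be added to the next-item prediction objective of a sequential encoder (e.g., a SASRec-style Transformer producing the user embeddings `z1` and `z2` from two augmentations of the same behavior sequence); the paper's exact loss forms and weighting may differ.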

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2404.07219
Document Type: Working Paper