
Rethinking Lifelong Sequential Recommendation with Incremental Multi-Interest Attention

Authors :
Wu, Yongji
Yin, Lu
Lian, Defu
Yin, Mingyang
Gong, Neil Zhenqiang
Zhou, Jingren
Yang, Hongxia
Publication Year :
2021

Abstract

Sequential recommendation plays an increasingly important role in many e-commerce services such as display advertising and online shopping. With the rapid development of these services over the last two decades, users have accumulated massive amounts of behavior data, and richer sequential behavior data has been shown to be of great value for sequential recommendation. However, traditional sequential models fail to handle users' lifelong sequences, as their linear computational and storage costs prohibit online inference. Recently, lifelong sequential modeling methods that borrow the idea of memory networks from NLP have been proposed to address this issue. However, the underlying RNN-based memory networks intrinsically suffer from an inability to capture long-term dependencies and may instead be overwhelmed by noise on extremely long behavior sequences. In addition, as a user's behavior sequence grows longer, it exhibits more diverse interests; it is therefore crucial to model and capture these interests. To tackle these issues, we propose a novel lifelong incremental multi-interest self-attention-based sequential recommendation model, LimaRec. Our proposed method benefits from carefully designed self-attention to identify relevant information for different interests in users' behavior sequences, while remaining able to incrementally update users' representations for online inference, similar to memory-network-based approaches. We extensively evaluate our method on four real-world datasets and demonstrate its superior performance compared to state-of-the-art baselines.

Comment: 11 pages
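The abstract's key claim is that attention over a lifelong behavior sequence can be updated incrementally, like a memory network, rather than recomputed from scratch. The paper's exact mechanism is not given here, but a minimal sketch of the general idea is linear attention with running sums: the model keeps per-user summaries S = Σ φ(kₜ)vₜᵀ and z = Σ φ(kₜ), so absorbing a new behavior costs O(d²) regardless of sequence length, and multiple interest queries read the same state. All names below (`IncrementalMultiInterestState`, the ELU+1 feature map, learned interest queries) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def phi(x):
    # ELU(x) + 1 feature map, a common positive kernel for linear attention.
    return np.where(x > 0, x + 1.0, np.exp(x))

class IncrementalMultiInterestState:
    """Hypothetical per-user state updatable one behavior at a time.

    Linear attention keeps running sums S = sum_t phi(k_t) v_t^T and
    z = sum_t phi(k_t); adding a behavior is O(d^2), independent of the
    length of the lifelong sequence.
    """

    def __init__(self, d, num_interests, seed=0):
        self.S = np.zeros((d, d))   # key-value summary
        self.z = np.zeros(d)        # attention normalizer
        rng = np.random.default_rng(seed)
        # Stand-in for learned interest queries, one per interest channel.
        self.interest_queries = rng.standard_normal((num_interests, d))

    def update(self, k, v):
        """Absorb one new behavior (key k, value v) into the state."""
        fk = phi(k)
        self.S += np.outer(fk, v)
        self.z += fk

    def interests(self):
        """Read out one representation vector per interest query."""
        q = phi(self.interest_queries)            # (m, d)
        num = q @ self.S                          # (m, d) weighted value sums
        den = (q @ self.z)[:, None] + 1e-8        # (m, 1) normalizers
        return num / den
```

The readout is mathematically identical to attending over the full history with kernel weights φ(q)·φ(kₜ), which is what makes the incremental update lossless under this linearized-attention assumption.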

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1269553302
Document Type :
Electronic Resource