
Weight Adjustment Framework for Self-Attention Sequential Recommendation

Authors :
Su, Zheng-Ang
Zhang, Juan
Source :
Applied Sciences (2076-3417); May 2024, Vol. 14 Issue 9, p3608, 14p
Publication Year :
2024

Abstract

In recent years, sequential recommendation systems have become a hot topic in recommendation system research. These systems predict future user actions or preferences by analyzing historical interaction sequences, such as browsing history and purchase records, and then recommend items that may interest the user. Among sequential recommendation algorithms, those based on the Transformer model have become a focus of research due to their powerful self-attention mechanisms. However, one of the main challenges faced by sequential recommendation systems is noise in the input data, such as erroneous clicks and incidental browsing. This noise can disrupt the model's allocation of attention weights, thereby degrading the accuracy and personalization of the recommendation results. To address this issue, we propose a novel method named "weight adjustment framework for self-attention sequential recommendation" (WAF-SR). WAF-SR mitigates the negative impact of noise on the attention layer's weight distribution by improving the quality of the input data. Furthermore, WAF-SR enhances the model's understanding of user behavior by simulating the uncertainty of user preferences, allowing a more precise distribution of attention weights during training. Finally, a series of experiments demonstrates the effectiveness of WAF-SR in enhancing the performance of sequential recommendation systems. [ABSTRACT FROM AUTHOR]
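The abstract does not give implementation details, so the following is only a minimal conceptual sketch of the two ideas it describes: down-weighting likely-noisy interactions before self-attention and injecting small perturbations to simulate preference uncertainty during training. The module name `NoiseAwareSelfAttention`, the `reliability_gate`, and the `noise_std` parameter are illustrative assumptions, not the authors' WAF-SR implementation.

```python
# Conceptual sketch (not the paper's code): self-attention over an item sequence
# with (a) a learned per-position reliability gate that suppresses likely-noisy
# interactions, and (b) Gaussian perturbation of embeddings during training to
# mimic uncertainty in user preferences. All names here are hypothetical.
import torch
import torch.nn as nn


class NoiseAwareSelfAttention(nn.Module):
    def __init__(self, num_items: int, d_model: int = 64, max_len: int = 50,
                 noise_std: float = 0.1):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, d_model, padding_idx=0)
        self.pos_emb = nn.Embedding(max_len, d_model)
        self.attn = nn.MultiheadAttention(d_model, num_heads=2, batch_first=True)
        # Scalar gate in (0, 1) per position, intended to down-weight noisy clicks.
        self.reliability_gate = nn.Sequential(nn.Linear(d_model, 1), nn.Sigmoid())
        self.noise_std = noise_std

    def forward(self, item_seq: torch.Tensor) -> torch.Tensor:
        # item_seq: (batch, seq_len) of item ids, where 0 marks padding.
        positions = torch.arange(item_seq.size(1), device=item_seq.device)
        x = self.item_emb(item_seq) + self.pos_emb(positions)

        if self.training:
            # Simulate uncertainty in user preferences with small Gaussian noise.
            x = x + self.noise_std * torch.randn_like(x)

        # Scale each position by its estimated reliability before attention.
        x = x * self.reliability_gate(x)

        pad_mask = item_seq.eq(0)  # True at padded positions
        out, _ = self.attn(x, x, x, key_padding_mask=pad_mask)
        return out[:, -1, :]  # representation of the most recent interaction

    def score(self, item_seq: torch.Tensor) -> torch.Tensor:
        # Dot-product scores against all item embeddings for next-item ranking.
        user_repr = self.forward(item_seq)
        return user_repr @ self.item_emb.weight.t()


if __name__ == "__main__":
    model = NoiseAwareSelfAttention(num_items=1000)
    seqs = torch.randint(1, 1001, (4, 50))  # batch of 4 interaction sequences
    print(model.score(seqs).shape)          # torch.Size([4, 1001])
```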

Details

Language :
English
ISSN :
2076-3417
Volume :
14
Issue :
9
Database :
Complementary Index
Journal :
Applied Sciences (2076-3417)
Publication Type :
Academic Journal
Accession number :
177181411
Full Text :
https://doi.org/10.3390/app14093608