Applied Sciences (Apr 2024)

Weight Adjustment Framework for Self-Attention Sequential Recommendation

  • Zheng-Ang Su,
  • Juan Zhang

DOI: https://doi.org/10.3390/app14093608
Journal volume & issue: Vol. 14, No. 9, p. 3608

Abstract


In recent years, sequential recommendation systems have become a prominent topic in recommendation system research. These systems predict future user actions or preferences by analyzing historical interaction sequences, such as browsing history and purchase records, and then recommend items the user is likely to be interested in. Among sequential recommendation algorithms, those based on the Transformer model have become a research focus due to their powerful self-attention mechanisms. However, one of the main challenges faced by sequential recommendation systems is noise in the input data, such as erroneous clicks and incidental browsing. This noise can disrupt the model’s accurate allocation of attention weights, degrading the accuracy and personalization of the recommendation results. To address this issue, we propose a novel method named the “weight adjustment framework for self-attention sequential recommendation” (WAF-SR). WAF-SR mitigates the negative impact of noise on the accuracy of the attention layer’s weight distribution by improving the quality of the input data. Furthermore, WAF-SR enhances the model’s understanding of user behavior by modeling the uncertainty of user preferences, allowing a more precise distribution of attention weights during training. Finally, a series of experiments demonstrates the effectiveness of WAF-SR in enhancing the performance of sequential recommendation systems.
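To make the noise problem concrete, the sketch below shows standard scaled dot-product self-attention over a sequence of item embeddings, as used in Transformer-based recommenders. This is generic attention, not the WAF-SR method itself; the embedding matrix `E` and its dimensions are illustrative assumptions. A single noisy interaction (e.g., an erroneous click) contributes a query/key row just like a genuine one, so it can draw attention weight away from informative items.

```python
import numpy as np

def self_attention_weights(E):
    """Attention weight matrix for an item-embedding sequence E of shape (seq_len, d).

    Row i holds the weights with which item i attends to every item in the
    sequence; each row is a softmax, so it sums to 1. A noise item's row and
    column participate on equal footing with genuine interactions, which is
    how erroneous clicks can distort the weight distribution.
    """
    d = E.shape[1]
    scores = (E @ E.T) / np.sqrt(d)              # scaled dot-product scores
    scores -= scores.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    w = np.exp(scores)
    return w / w.sum(axis=1, keepdims=True)

# Illustrative use: 5 interactions, 8-dimensional embeddings (hypothetical sizes).
rng = np.random.default_rng(0)
E = rng.normal(size=(5, 8))
W = self_attention_weights(E)
print(W.shape)                     # (5, 5)
print(np.allclose(W.sum(axis=1), 1.0))  # each row is a probability distribution
```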

Keywords