Applied Sciences (Sep 2024)

Enhancing Explainable Recommendations: Integrating Reason Generation and Rating Prediction through Multi-Task Learning

  • Xingyu Zhu,
  • Xiaona Xia,
  • Yuheng Wu,
  • Wenxu Zhao

DOI: https://doi.org/10.3390/app14188303
Journal volume & issue: Vol. 14, No. 18, p. 8303

Abstract


In recent years, recommender systems, which infer users' preferences from their historical behavior to provide personalized recommendations, have become essential tools across various domains, including e-commerce, streaming media, and social platforms. By mining vast amounts of data to identify what is most relevant to each user, recommender systems play a crucial role in enhancing user experience. Among them, deep learning-based recommender systems have demonstrated exceptional recommendation performance. However, these "black-box" systems lack reasonable explanations for their recommendation results, which reduces their impact and credibility. An effective strategy to address this issue is to provide a personalized textual explanation along with each recommendation. This approach has received increasing attention from researchers because intuitive explanations can enhance users' trust in recommender systems. In this context, our paper introduces a novel explainable recommendation model named GCLTE. The model integrates Graph Contrastive Learning with transformers within an Encoder–Decoder framework to perform rating prediction and reason generation simultaneously. In addition, we combine the neural network layer with the transformer through a straightforward information enhancement operation. Finally, extensive experiments on three real-world datasets demonstrate the effectiveness of GCLTE in both recommendation and explanation; the results show that our model outperforms the strongest existing baselines.
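The abstract describes the multi-task setup (joint rating prediction and explanation generation in an encoder–decoder framework) only at a high level, and the paper's actual GCLTE architecture, including its graph contrastive learning component, is not reproduced here. The PyTorch sketch below is a minimal, hypothetical illustration of such a multi-task encoder–decoder; all class names, dimensions, loss weights, and the way the two heads share the encoder are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only (NOT the authors' GCLTE code): a multi-task
# encoder-decoder that jointly predicts a rating and generates an
# explanation token sequence from a user-item pair.
import torch
import torch.nn as nn

class MultiTaskExplainableRec(nn.Module):
    def __init__(self, n_users, n_items, vocab_size,
                 d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, d_model)
        self.item_emb = nn.Embedding(n_items, d_model)
        self.tok_emb = nn.Embedding(vocab_size, d_model)

        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        dec_layer = nn.TransformerDecoderLayer(d_model, n_heads, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, n_layers)

        # Two task heads sharing the same encoder representation:
        # a regression head for the rating and a token head for explanations.
        self.rating_head = nn.Linear(2 * d_model, 1)
        self.token_head = nn.Linear(d_model, vocab_size)

    def forward(self, user_ids, item_ids, expl_tokens):
        u = self.user_emb(user_ids)        # (B, d)
        i = self.item_emb(item_ids)        # (B, d)

        # Encode the user-item pair as a length-2 "sequence".
        src = torch.stack([u, i], dim=1)   # (B, 2, d)
        memory = self.encoder(src)         # (B, 2, d)

        # Task 1: rating prediction from the encoded pair.
        rating = self.rating_head(memory.reshape(memory.size(0), -1)).squeeze(-1)

        # Task 2: explanation generation conditioned on the encoder memory,
        # with a causal mask so each position sees only earlier tokens.
        tgt = self.tok_emb(expl_tokens)    # (B, T, d)
        T = tgt.size(1)
        causal_mask = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        dec_out = self.decoder(tgt, memory, tgt_mask=causal_mask)
        token_logits = self.token_head(dec_out)  # (B, T, vocab)
        return rating, token_logits

# Joint training sums both task losses (equal weights are an assumption).
model = MultiTaskExplainableRec(n_users=100, n_items=200, vocab_size=500)
users = torch.randint(0, 100, (8,))
items = torch.randint(0, 200, (8,))
tokens = torch.randint(0, 500, (8, 12))      # toy explanation sequences
ratings = torch.rand(8) * 5
pred_rating, logits = model(users, items, tokens[:, :-1])
loss = nn.MSELoss()(pred_rating, ratings) + \
       nn.CrossEntropyLoss()(logits.reshape(-1, 500), tokens[:, 1:].reshape(-1))
loss.backward()
```

In this toy setup the two objectives are optimized jointly through the shared encoder, which is the general idea behind multi-task explainable recommendation; GCLTE additionally incorporates graph contrastive learning and an information enhancement operation that are not modeled here.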

Keywords