Mathematics (Jul 2024)

GAT4Rec: Sequential Recommendation with a Gated Recurrent Unit and Transformers

  • Huaiwen He,
  • Xiangdong Yang,
  • Feng Huang,
  • Feng Yi,
  • Shangsong Liang

DOI
https://doi.org/10.3390/math12142189
Journal volume & issue
Vol. 12, no. 14
p. 2189

Abstract


Capturing long-term dependencies from historical behaviors is key to the success of sequential recommendation; however, existing methods focus on extracting global sequential information while neglecting deep representations of subsequences. Previous research has revealed that restricted inter-item transfer is fundamental to sequential modeling, and that certain substructures of a sequence can help models learn more effective long-term dependencies than the whole sequence can. To automatically find better subsequences and learn from them efficiently, we propose a sequential recommendation model with a gated recurrent unit and Transformers, abbreviated as GAT4Rec, which employs Transformers with parameters shared across layers to model users' historical interaction sequences. The representation learned by the gated recurrent unit serves as a gating signal that identifies the optimal substructure in a user's sequence. The encoding layer extracts a fused representation of the subsequence and edge information, from which the corresponding recommendations are made. Experimental results on four well-known publicly available datasets demonstrate that GAT4Rec outperforms other recommendation models, achieving improvements of 5.77%, 1.35%, 11.58%, and 1.79% in normalized discounted cumulative gain (NDCG@10) on the four datasets, respectively.
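To make the architecture described above concrete, the following is a minimal PyTorch sketch, not the authors' implementation: a GRU runs over the embedded interaction sequence, its per-position outputs produce a sigmoid gating signal that softly selects a subsequence, and a single Transformer encoder layer is reused across depth to approximate cross-layer parameter sharing. All class, parameter, and variable names (e.g., GRUGatedTransformer, d_model, n_layers) are hypothetical illustrations, and the soft gate stands in for whatever substructure-selection mechanism the paper actually uses.

```python
# Hypothetical sketch of a GRU-gated, parameter-shared Transformer for
# next-item recommendation; not the published GAT4Rec code.
import torch
import torch.nn as nn


class GRUGatedTransformer(nn.Module):
    def __init__(self, num_items: int, d_model: int = 64,
                 n_heads: int = 2, n_layers: int = 2):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, d_model, padding_idx=0)
        # The GRU summarizes the history; its per-step outputs drive the gate.
        self.gru = nn.GRU(d_model, d_model, batch_first=True)
        self.gate = nn.Linear(d_model, 1)  # per-position keep/drop score
        # One encoder layer applied n_layers times = shared parameters across layers.
        self.shared_layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.n_layers = n_layers
        self.out = nn.Linear(d_model, num_items + 1)

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        # seq: (batch, seq_len) item ids, 0 = padding
        x = self.item_emb(seq)                  # (B, L, D)
        h, _ = self.gru(x)                      # (B, L, D)
        g = torch.sigmoid(self.gate(h))         # (B, L, 1) soft gating signal
        x = g * x                               # soft subsequence selection
        pad_mask = seq.eq(0)                    # mask out padding positions
        for _ in range(self.n_layers):          # reuse the same layer weights
            x = self.shared_layer(x, src_key_padding_mask=pad_mask)
        return self.out(x[:, -1])               # next-item scores


if __name__ == "__main__":
    model = GRUGatedTransformer(num_items=1000)
    batch = torch.randint(1, 1001, (4, 20))     # 4 sequences of length 20
    print(model(batch).shape)                   # torch.Size([4, 1001])
```

The design choice worth noting is that reusing one encoder layer keeps the parameter count constant in depth, which is one plausible reading of "Transformers with shared parameters across layers" in the abstract.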

Keywords