IEEE Access (Jan 2019)

A Session-Based Customer Preference Learning Method by Using the Gated Recurrent Units With Attention Function

  • Jenhui Chen,
  • Ashu Abdul

DOI
https://doi.org/10.1109/ACCESS.2019.2895647
Journal volume & issue
Vol. 7
pp. 17750 – 17759

Abstract

In this paper, we investigate an attention function combined with gated recurrent units (GRUs), named GRUA, to improve the accuracy of customer preference prediction. The attention function extracts the important product features by using a time-bias parameter and a term frequency-inverse document frequency (TF-IDF) parameter for recommending products to a customer in the ongoing session. We show that the attention function combined with the GRUs learns the customer's intention in the ongoing session more precisely than existing session-based recommendation (SBR) methods. The experimental results show that the GRUA outperforms two SBR methods, stacked denoising autoencoders with collaborative filtering (SDAE/CF) and GRUs with collaborative filtering (GRU/CF), in terms of the precision and recall evaluation metrics. Data from three publicly available datasets, the Amazon Product Review dataset, the Xing dataset, and the Yoo-Choose Click dataset, are used to compare the performance of the GRUA against the SDAE/CF and the GRU/CF. This paper shows that adopting the attention function into the GRUs can substantially increase the accuracy of product recommendation in SBR.
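
The abstract does not specify the GRUA architecture in detail. As a rough illustration only, the following PyTorch sketch shows one way a GRU session encoder could be combined with an attention function whose scores incorporate a per-click time-bias weight and a TF-IDF weight. The class name GRUAttentionEncoder, the scalar form of the two weights, and all dimensions are assumptions for illustration, not the authors' implementation.

# Minimal sketch (not the authors' code): a GRU session encoder with an
# additive attention function.  The attention score for each click mixes the
# GRU hidden state with two side features standing in for the paper's
# time-bias and TF-IDF parameters; their exact formulation is assumed here.
import torch
import torch.nn as nn


class GRUAttentionEncoder(nn.Module):
    def __init__(self, num_items: int, emb_dim: int = 64, hidden_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(num_items, emb_dim)
        self.gru = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        # Attention scores the GRU outputs together with the two side features.
        self.attn = nn.Linear(hidden_dim + 2, 1)
        self.out = nn.Linear(hidden_dim, num_items)

    def forward(self, item_ids, time_bias, tfidf):
        # item_ids:  (batch, seq_len) clicked-item indices in the ongoing session
        # time_bias: (batch, seq_len) recency weight per click (assumed scalar)
        # tfidf:     (batch, seq_len) TF-IDF weight of the item features (assumed scalar)
        h, _ = self.gru(self.embed(item_ids))              # (batch, seq_len, hidden)
        side = torch.stack([time_bias, tfidf], dim=-1)     # (batch, seq_len, 2)
        scores = self.attn(torch.cat([h, side], dim=-1))   # (batch, seq_len, 1)
        alpha = torch.softmax(scores, dim=1)               # attention weights over clicks
        session = (alpha * h).sum(dim=1)                   # attention-weighted session vector
        return self.out(session)                           # scores for the next item

if __name__ == "__main__":
    model = GRUAttentionEncoder(num_items=1000)
    clicks = torch.randint(0, 1000, (2, 5))
    print(model(clicks, torch.rand(2, 5), torch.rand(2, 5)).shape)  # torch.Size([2, 1000])

In this reading, the time-bias and TF-IDF terms let the attention weights favor recent clicks and distinctive product features, which is consistent with the abstract's claim that the attention function captures the customer's intention within the ongoing session.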

Keywords