Applied Sciences (Feb 2019)

A Deep Temporal Neural Music Recommendation Model Utilizing Music and User Metadata

  • Hai-Tao Zheng,
  • Jin-Yuan Chen,
  • Nan Liang,
  • Arun Kumar Sangaiah,
  • Yong Jiang,
  • Cong-Zhi Zhao

DOI
https://doi.org/10.3390/app9040703
Journal volume & issue
Vol. 9, no. 4
p. 703

Abstract

Deep learning shows its superiority in many domains such as computer vision, natural language processing, and speech recognition. In music recommendation, most deep learning-based methods focus on learning users’ temporal preferences from their listening histories. However, these methods do not address the cold start problem and do not fully exploit music characteristics. In addition, the music characteristics and the users’ temporal preferences are not combined naturally, which leads to relatively low recommendation performance. To address these issues, we propose a Deep Temporal Neural Music Recommendation model (DTNMR) based on music characteristics and the users’ temporal preferences. We encode the music metadata into one-hot vectors and use a Deep Neural Network to project the music vectors into a low-dimensional space and obtain the music characteristics. In addition, Long Short-Term Memory (LSTM) neural networks are used to learn users’ long-term and short-term preferences from their listening histories. DTNMR alleviates the cold start problem on the item side using the music metadata and discovers new users’ preferences immediately after they listen to music. The experimental results show that DTNMR outperforms seven baseline methods in terms of recall, precision, F-measure, MAP, user coverage and AUC.
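The following is a minimal PyTorch sketch of the architecture the abstract describes, not the authors' implementation: one-hot music metadata is projected by a DNN into a low-dimensional music-characteristic vector, an LSTM summarizes a user's listening history into a temporal preference vector, and a dot product scores candidate tracks. All layer names, sizes, and the scoring function are illustrative assumptions.

import torch
import torch.nn as nn

class DTNMRSketch(nn.Module):
    def __init__(self, metadata_dim, embed_dim=64, hidden_dim=128):
        super().__init__()
        # DNN mapping one-hot (multi-hot) music metadata to a dense embedding
        self.music_dnn = nn.Sequential(
            nn.Linear(metadata_dim, 256), nn.ReLU(),
            nn.Linear(256, embed_dim), nn.ReLU(),
        )
        # LSTM over the embeddings of previously listened tracks,
        # intended to capture long- and short-term preferences
        self.user_lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.user_proj = nn.Linear(hidden_dim, embed_dim)

    def forward(self, history_metadata, candidate_metadata):
        # history_metadata: (batch, seq_len, metadata_dim)
        # candidate_metadata: (batch, metadata_dim)
        hist_emb = self.music_dnn(history_metadata)       # (B, T, E)
        _, (h_n, _) = self.user_lstm(hist_emb)            # h_n: (1, B, H)
        user_vec = self.user_proj(h_n.squeeze(0))         # (B, E)
        cand_vec = self.music_dnn(candidate_metadata)     # (B, E)
        # Relevance score between user preference and candidate track
        return (user_vec * cand_vec).sum(dim=-1)

# Usage example with toy shapes
model = DTNMRSketch(metadata_dim=1000)
history = torch.zeros(2, 20, 1000)      # 2 users, 20 past tracks each
candidates = torch.zeros(2, 1000)       # one candidate track per user
scores = model(history, candidates)     # shape (2,)

Because scoring depends only on metadata embeddings and the listening sequence, a new track with metadata can be scored without interaction data, and a new user's preference vector updates as soon as a listening history exists, which is the cold-start behavior the abstract claims.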

Keywords