IEEE Access (Jan 2020)

A Sparse Connected Long Short-Term Memory With Sharing Weight for Time Series Prediction

  • Liyan Xiong,
  • Xiangzheng Ling,
  • Xiaohui Huang,
  • Hong Tang,
  • Weimin Yuan,
  • Weichun Huang

DOI
https://doi.org/10.1109/ACCESS.2020.2984796
Journal volume & issue
Vol. 8
pp. 66856–66866

Abstract

The development of the mobile Internet and the success of deep learning in many applications have driven the need to deploy deep learning models on mobile devices with limited resources. Long Short-Term Memory (LSTM), a special architecture in deep learning, can learn long-distance dependencies hidden in time series. However, the high computational complexity of LSTM-related structures and the large amount of resources required for training have become obstacles to their deployment on mobile devices. To reduce the resource requirements and computational cost of LSTMs, we use a pruning strategy to preserve important connections during the training phase. After training, we further reduce the complexity of the LSTM network with a weight-sharing strategy. Based on these strategies, we propose a sparsely connected LSTM with shared weights (SCLSTM). Experimental results on real data sets show that SCLSTM with only 0.88% of the neural connections can achieve prediction performance comparable to a densely connected LSTM. Moreover, SCLSTM alleviates overfitting to some extent. The experiments demonstrate that SCLSTM outperforms state-of-the-art algorithms on mobile devices with limited resources.
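
The abstract describes a two-stage compression pipeline: pruning during training to keep only important connections, then weight sharing to shrink the surviving weights. As a rough illustration (not the authors' implementation), the following NumPy sketch prunes a single weight matrix by magnitude and then quantizes the survivors to a small codebook of shared values; the function names, the magnitude criterion, and the k-means-style clustering are illustrative assumptions.

    import numpy as np

    def magnitude_prune(W, sparsity):
        # Zero out the smallest-magnitude entries of W, keeping only the
        # strongest connections (generic magnitude pruning; the paper's
        # exact importance criterion may differ).
        k = int(W.size * sparsity)          # number of weights to drop
        if k == 0:
            return W.copy(), np.ones(W.shape, dtype=bool)
        threshold = np.partition(np.abs(W).ravel(), k - 1)[k - 1]
        mask = np.abs(W) > threshold
        return W * mask, mask

    def share_weights(W, mask, n_clusters=16):
        # Map the surviving weights onto a small codebook of shared values
        # (k-means-style weight sharing), so the matrix can be stored as
        # integer indices plus n_clusters floats.
        vals = W[mask]
        centroids = np.linspace(vals.min(), vals.max(), n_clusters)
        for _ in range(10):                 # a few Lloyd iterations
            idx = np.abs(vals[:, None] - centroids[None, :]).argmin(axis=1)
            for c in range(n_clusters):
                if np.any(idx == c):
                    centroids[c] = vals[idx == c].mean()
        W_shared = W.copy()
        W_shared[mask] = centroids[idx]
        return W_shared, centroids

    rng = np.random.default_rng(0)
    W = rng.normal(size=(256, 256))          # stand-in for one LSTM gate matrix
    W_pruned, mask = magnitude_prune(W, sparsity=0.9912)  # keep ~0.88% of weights
    W_final, codebook = share_weights(W_pruned, mask)
    print(f"surviving connections: {mask.mean():.4%}, codebook size: {codebook.size}")

In a full LSTM, the same procedure would be applied to each gate's input and recurrent weight matrices; the 0.9912 sparsity level here is chosen only to mirror the 0.88% connection figure reported in the abstract.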

Keywords