IEEE Access (Jan 2021)

KeyMemoryRNN: A Flexible Prediction Framework for Spatiotemporal Prediction Networks

  • Shengchun Wang,
  • Xiang Lin,
  • Huijie Zhu

DOI
https://doi.org/10.1109/ACCESS.2021.3114215
Journal volume & issue
Vol. 9
pp. 147678 – 147691

Abstract

Most previous recurrent neural networks for spatiotemporal prediction have difficulty learning long-term spatiotemporal correlations and capturing skip-frame correlations. The reason is that these networks update their memory states using only information from the previous time step, and they tend to suffer from gradient propagation difficulties. We propose a new framework, KeyMemoryRNN, which makes two contributions. First, we propose the KeyTranslate Module, which extracts the most effective historical memory state, named the keyword state, and KeyMemoryLSTM, which uses the keyword state to update the hidden state and thereby capture skip-frame correlations. KeyMemoryLSTM is trained in two stages: in the second stage, it adaptively skips the updates of some time step nodes to build a shorter memory information flow, which alleviates the difficulty of gradient propagation and helps learn long-term spatiotemporal correlations. Second, both the KeyTranslate Module and KeyMemoryLSTM are flexible add-on modules, so they can be applied to most RNN-based prediction networks to build a KeyMemoryRNN with different base networks. KeyMemoryRNN achieves state-of-the-art performance on three spatiotemporal prediction tasks, and we provide ablation studies and memory analysis to verify its effectiveness.
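The two mechanisms the abstract describes can be sketched in a few lines: selecting the most relevant historical memory state (the role of the KeyTranslate Module) and gating whether a time step's update can be skipped in favor of that keyword state. The similarity scoring, the blend rule, and the threshold below are illustrative assumptions, not the paper's actual learned functions:

```python
import numpy as np

def select_keyword_state(history, h_t):
    """Pick the historical memory state most similar to the current
    hidden state -- a stand-in for the KeyTranslate Module; the
    paper's actual scoring function may differ."""
    scores = np.array([float(h_t @ m) for m in history])  # dot-product similarity
    k = int(np.argmax(scores))
    return history[k], k

def skip_or_update(h_t, keyword_state, threshold=0.5):
    """Adaptively skip the update when the keyword state already agrees
    strongly with the current hidden state (illustrative gate, not the
    paper's trained skipping criterion)."""
    cos = float(h_t @ keyword_state) / (
        np.linalg.norm(h_t) * np.linalg.norm(keyword_state) + 1e-8)
    if cos > threshold:
        # Strong agreement: reuse the keyword state, shortening the
        # memory information flow (fewer update steps for gradients).
        return keyword_state, True
    # Otherwise perform a (here, trivial) update blending both states.
    blended = 0.5 * h_t + 0.5 * keyword_state
    return blended, False
```

Skipping updates this way shortens the chain of state transformations that gradients must traverse, which is the intuition behind the second training stage described above.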

Keywords