IEEE Access (Jan 2024)

EnPredRNN: An Enhanced PredRNN Network for Extending Spatio-Temporal Prediction Period

  • Dali Wu,
  • Jiayang Kong,
  • Zhicheng Li,
  • Guojing Zhang,
  • Huaicong Zhang,
  • Jing Liang,
  • Xing Zhang

DOI
https://doi.org/10.1109/ACCESS.2024.3438992
Journal volume & issue
Vol. 12
pp. 107631 – 107644

Abstract

We propose the Enhanced Predictive Recurrent Neural Network (EnPredRNN), built on PredRNN, to extend the period of spatio-temporal prediction. To better capture global spatial dependencies, we integrate a Self-Attention (SA) module into PredRNN’s basic unit, forming the Enhanced Long Short-Term Memory (EnLSTM). Because the standard temporal memory state in PredRNN carries insufficient information about inter-frame motion, we propose the Enhanced Temporal Memory (ETM), which aggregates past multi-step temporal memory states. To address the vanishing-gradient problem in Recurrent Neural Networks (RNNs), the Alleviating Gradient Vanishing (AGV) structure constructs a high-speed path that facilitates gradient propagation. Experimental results show that EnPredRNN effectively extends spatio-temporal prediction from ten to thirty time-steps.
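The ETM idea — aggregating several past temporal memory states instead of passing only the most recent one — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `enhanced_temporal_memory`, the use of a weighted sum, and the uniform default weights are all assumptions made for the example.

```python
import numpy as np

def enhanced_temporal_memory(past_states, weights=None):
    """Aggregate past multi-step temporal memory states into one tensor.

    past_states: list of K arrays, each shaped (channels, height, width),
                 ordered oldest to newest.
    weights:     optional length-K mixing weights; defaults to uniform.
                 (A weighted sum is one plausible aggregation; the paper
                 may use a different, possibly learned, scheme.)
    """
    k = len(past_states)
    if weights is None:
        weights = np.full(k, 1.0 / k)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalise so contributions sum to 1
    # Weighted sum over the K past memory states, elementwise per pixel.
    return sum(w * s for w, s in zip(weights, past_states))
```

The aggregated tensor would then replace the single previous-step temporal memory state fed into the recurrent cell, giving the unit a longer view of inter-frame motion.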

Keywords