IEEE Access (Jan 2020)

Fast Trajectory Prediction Method With Attention Enhanced SRU

  • Yadong Li,
  • Bailong Liu,
  • Lei Zhang,
  • Susong Yang,
  • Changxing Shao,
  • Dan Song

DOI
https://doi.org/10.1109/ACCESS.2020.3035704
Journal volume & issue
Vol. 8
pp. 206614–206621

Abstract

LSTM (Long Short-Term Memory) is an effective method for trajectory prediction. However, computing the state of each neuron in the hidden layer requires the state of the previous unit, which leads to excessively long training and prediction times. To solve this problem, we propose a Fast Trajectory Prediction method with Attention-enhanced SRU (FTP-AS). Firstly, we devise a trajectory prediction method based on the SRU (Simple Recurrent Unit). It removes the dependency on the hidden-layer state at the previous moment, allowing the model to be computed largely in parallel and speeding up training and prediction. However, each SRU unit computes its state at each moment independently, ignoring the temporal relationship between trajectory points and reducing accuracy. Secondly, we enhance the SRU with an attention mechanism. Influence weights for selective learning are obtained by computing the matching degree of the hidden-layer state at each moment, which improves prediction accuracy. Finally, experimental results on the MTA bus data set and the Porto taxi data set show that FTP-AS is 3.4 times faster and about 1.7% more accurate than the traditional LSTM method.
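The sketch below illustrates the two ideas the abstract describes: an SRU layer whose heavy matrix multiplications depend only on the inputs (so they can be computed for all time steps at once, leaving only a light element-wise recurrence), and an attention step that weights the hidden states by a matching degree. It follows the standard SRU formulation; the layer sizes, the dot-product scoring against the last hidden state, and all names are illustrative assumptions, not the authors' exact FTP-AS implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sru_forward(X, W, Wf, bf, Wr, br):
    """Run an SRU layer over a trajectory sequence X of shape (T, d).

    The matrix products with W, Wf, Wr depend only on the inputs, so they are
    computed for all time steps in parallel; only the element-wise recurrence
    over the internal state c remains sequential.
    """
    Xt = X @ W                      # candidate states, all steps at once
    F = sigmoid(X @ Wf + bf)        # forget gates, all steps at once
    R = sigmoid(X @ Wr + br)        # reset gates, all steps at once
    T, d = Xt.shape
    c = np.zeros(d)
    H = np.zeros((T, d))
    for t in range(T):              # light element-wise recurrence only
        c = F[t] * c + (1.0 - F[t]) * Xt[t]
        H[t] = R[t] * np.tanh(c) + (1.0 - R[t]) * X[t]
    return H

def attention_context(H):
    """Weight hidden states by their matching degree (assumed here to be a
    dot product with the final hidden state) and return the weighted context."""
    scores = H @ H[-1]                        # matching degree per time step
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                  # softmax -> influence weights
    return weights @ H                        # attention-weighted context

# Toy usage: a 10-point trajectory embedded in 16 dimensions.
rng = np.random.default_rng(0)
d = 16
X = rng.normal(size=(10, d))
W, Wf, Wr = (rng.normal(scale=0.1, size=(d, d)) for _ in range(3))
bf = br = np.zeros(d)
H = sru_forward(X, W, Wf, bf, Wr, br)
context = attention_context(H)   # would feed a final layer that predicts the next point
print(context.shape)             # (16,)
```

Because the gate activations are precomputed for the whole sequence, only the cheap loop over `c` is inherently serial, which is the source of the speed-up over an LSTM, where every gate at step t needs the hidden state from step t-1.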

Keywords