IEEE Access (Jan 2023)

Long-Term Trajectory Prediction Model Based on Transformer

  • Qiang Tong,
  • Jinqing Hu,
  • Yuli Chen,
  • Dongdong Guo,
  • Xiulei Liu

DOI
https://doi.org/10.1109/ACCESS.2023.3343800
Journal volume & issue
Vol. 11
pp. 143695–143703

Abstract

Recurrent neural network models suffer from memory loss and vanishing gradients when processing long time-series data. This paper proposes a Transformer-based long-term trajectory prediction model to process long sequence information. First, positional encoding is used to preserve the relative positional relationships between trajectory points. Second, a multi-head attention mechanism fully learns the feature information across different trajectories, allowing the trajectory data to be encoded in a single pass. Finally, an encoder-decoder mechanism is used to predict future trajectory data. Compared with the long-term trajectory prediction benchmark method TrajAirNet, the average displacement error and absolute displacement error of the proposed model on the long-term trajectory dataset are reduced by about 8.2% and 51.4%, respectively. The experimental results show that the proposed model achieves higher accuracy and robustness on the long-term trajectory prediction dataset.
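The abstract notes that positional encoding preserves the relative positional relationships between trajectory points. The paper's exact formulation is not given here; a minimal sketch, assuming the standard sinusoidal encoding of the original Transformer, added to the embedded trajectory points before the encoder:

```python
import numpy as np

def sinusoidal_position_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Standard Transformer sinusoidal positional encoding (an assumption;
    the paper may use a learned or otherwise modified variant).
    Returns an array of shape (seq_len, d_model)."""
    positions = np.arange(seq_len)[:, None]            # (seq_len, 1)
    dims = np.arange(d_model)[None, :]                 # (1, d_model)
    # Each pair of dimensions shares one frequency: 10000^(-2i/d_model).
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])              # even dims: sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])              # odd dims: cosine
    return pe

# Hypothetical usage: a trajectory of 32 points embedded into 16 dimensions.
pe = sinusoidal_position_encoding(seq_len=32, d_model=16)
# encoded_input = embedded_trajectory + pe
```

Because the encoding depends only on position, the whole trajectory can be fed to the attention layers at once, which is what enables the one-pass encoding the abstract describes.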
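The reported gains are in average displacement error and absolute displacement error. A minimal sketch of how such displacement metrics are typically computed for a single predicted trajectory (the paper's exact metric definitions are not reproduced here; this assumes the common convention of averaging per-step Euclidean error, with the final-step error as the second metric):

```python
import numpy as np

def displacement_errors(pred: np.ndarray, truth: np.ndarray):
    """pred, truth: (T, 2) arrays of predicted / ground-truth (x, y) points.
    Returns (average displacement error, final-step displacement error)."""
    dists = np.linalg.norm(pred - truth, axis=1)  # per-step Euclidean error
    return dists.mean(), dists[-1]

# Hypothetical 3-step prediction against ground truth.
pred  = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
truth = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 1.0]])
ade, final_de = displacement_errors(pred, truth)
# ade = (0 + 1 + 1) / 3 ≈ 0.667, final_de = 1.0
```

In practice both metrics are averaged over all trajectories in the test set before comparison against the baseline.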

Keywords