IEEE Access (Jan 2024)

Online Attention Enhanced Differential and Decomposed LSTM for Time Series Prediction

  • Lina Li,
  • Shengkui Huang,
  • Guoxing Liu,
  • Cheng Luo,
  • Qinghe Yu,
  • Nianfeng Li

DOI
https://doi.org/10.1109/ACCESS.2024.3395651
Journal volume & issue
Vol. 12
pp. 62416–62428

Abstract

Due to the time variability and burstiness of data, accurate and lag-free time series prediction is challenging. To address these problems, we propose an online attention enhanced differential and decomposed LSTM (Long Short-Term Memory) model, called OADDL, which can better capture the comprehensive core features and important structures of time series. In this model, the core features of the time series are first extracted through differencing and decomposition, which reduce data complexity and remove noise. Then, a self-attention module and an LSTM capture the core features and important structures of the time series over its full span. Finally, an FCN (Fully Connected Network) fuses the omnidirectional features of the time series. In addition, we design an online two-stage training mode for this model, in which the attention-enhanced LSTM and the FCN are trained sequentially, and the training set and model hyper-parameters are continuously updated over time, further capturing the time-varying and bursty characteristics of the series. We evaluate the model on three typical datasets, and the experimental results show that, compared with recent representative deep learning models, OADDL predicts time series data more accurately and effectively alleviates the problem of prediction lag.
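
The abstract outlines a concrete pipeline: difference/decompose the series, pass each component through a self-attention layer followed by an LSTM, and fuse the branch outputs with a fully connected network. Below is a minimal PyTorch sketch of that pipeline for intuition only; the module names, layer sizes, head count, and two-branch split (trend plus differenced residual) are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal sketch of an OADDL-style pipeline, assuming a two-branch input
# (a decomposed trend component and a first-order differenced residual).
# All names, dimensions, and the 4-head attention are assumptions.
import torch
import torch.nn as nn

class AttentionLSTM(nn.Module):
    """Self-attention over the window, then an LSTM, per the abstract."""
    def __init__(self, d_in: int = 1, d_model: int = 64, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Linear(d_in, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.lstm = nn.LSTM(d_model, d_model, batch_first=True)

    def forward(self, x):                    # x: (batch, seq_len, d_in)
        h = self.embed(x)
        h, _ = self.attn(h, h, h)            # long-range structure over the window
        h, _ = self.lstm(h)                  # sequential dynamics
        return h[:, -1]                      # last hidden state: (batch, d_model)

class OADDLSketch(nn.Module):
    """Two attention-LSTM branches fused by a fully connected network (FCN)."""
    def __init__(self, d_model: int = 64):
        super().__init__()
        self.trend_branch = AttentionLSTM(d_model=d_model)
        self.resid_branch = AttentionLSTM(d_model=d_model)
        self.fcn = nn.Sequential(
            nn.Linear(2 * d_model, d_model), nn.ReLU(), nn.Linear(d_model, 1)
        )

    def forward(self, trend, resid):
        z = torch.cat([self.trend_branch(trend), self.resid_branch(resid)], -1)
        return self.fcn(z)                   # one-step-ahead prediction

def difference(x):
    """First-order differencing to strip trend and reduce non-stationarity."""
    return x[:, 1:] - x[:, :-1]

# Usage: align the raw window with its differenced version (one step shorter).
x = torch.randn(8, 33, 1)                    # batch of raw sliding windows
model = OADDLSketch()
y_hat = model(x[:, 1:], difference(x))       # both inputs: (8, 32, 1)
```

The online two-stage training mode would then fit the attention-LSTM branches first, fit the FCN on their outputs second, and periodically refresh the training window and hyper-parameters as new observations arrive; that loop is omitted here because the abstract does not specify its schedule.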

Keywords