IEEE Access (Jan 2022)

Robust LSTM With Tuned-PSO and Bifold-Attention Mechanism for Analyzing Multivariate Time-Series

  • Andri Pranolo,
  • Yingchi Mao,
  • Aji Prasetya Wibawa,
  • Agung Bella Putra Utama,
  • Felix Andika Dwiyanto

DOI
https://doi.org/10.1109/ACCESS.2022.3193643
Journal volume & issue
Vol. 10
pp. 78423–78434

Abstract

The need for accurate time-series forecasting results is pressing. LSTM has been applied to forecasting time series, which are generated when variables are observed at discrete, equally spaced time intervals. Nevertheless, determining hyperparameters with a relatively high degree of randomness reduces the accuracy of the prediction results. This paper proposes an LSTM with a tuned PSO and a bifold-attention mechanism. PSO optimizes the LSTM hyperparameters, and the bifold-attention mechanism selects the optimal inputs for the LSTM. The main contribution is an accurate, adaptive, and robust time-series forecasting model, compared against ARIMA, MLP, LSTM, PSO-LSTM, A-LSTM, and PSO-A-LSTM. The comparison is based on each model's accuracy in forecasting the Beijing PM2.5, Beijing Multi-Site, Air Quality, Appliances Energy, Wind Speed, and Traffic Flow datasets. The proposed model, LSTM with tuned PSO and a bifold-attention mechanism, achieves lower MAPE and RMSE than the baselines; in other words, it outperformed all LSTM base models in this study. The proposed model's accuracy holds across daily, weekly, and monthly multivariate time-series datasets. This innovation is valuable for time-series analysis research, particularly the application of deep learning to time-series forecasting.
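
Since the abstract only states that PSO tunes the LSTM hyperparameters and that models are judged by forecasting error, the following is a minimal Python sketch of that idea rather than the authors' implementation. The search space (hidden-unit count and learning rate), the bounds, the PSO coefficients, and the validation_rmse function are all assumptions; in practice the fitness call would train an LSTM with the candidate hyperparameters and return its validation RMSE.

# Minimal sketch of PSO-based LSTM hyperparameter tuning (assumed details;
# not the paper's exact search space, fitness function, or PSO settings).
import numpy as np

rng = np.random.default_rng(0)

def validation_rmse(units, learning_rate):
    # Hypothetical fitness: train an LSTM with these hyperparameters and
    # return its validation RMSE. Replaced here by a synthetic bowl-shaped
    # surface so the sketch runs without a full training pipeline.
    return (units - 64) ** 2 / 1e4 + (np.log10(learning_rate) + 3) ** 2

# Search bounds: [hidden units, learning rate]
lo, hi = np.array([8.0, 1e-4]), np.array([256.0, 1e-1])

n_particles, n_iters = 20, 30
pos = rng.uniform(lo, hi, size=(n_particles, 2))   # particle positions
vel = np.zeros_like(pos)                           # particle velocities
pbest = pos.copy()                                 # personal best positions
pbest_fit = np.array([validation_rmse(int(p[0]), p[1]) for p in pos])
gbest = pbest[np.argmin(pbest_fit)].copy()         # global best position

w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients (assumed)
for _ in range(n_iters):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    fit = np.array([validation_rmse(int(p[0]), p[1]) for p in pos])
    improved = fit < pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[np.argmin(pbest_fit)].copy()

print("best hidden units:", int(round(gbest[0])), "best learning rate:", gbest[1])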

Keywords