IEEE Access (Jan 2024)
A Comparative Multivariate Analysis of VAR and Deep Learning-Based Models for Forecasting Volatile Time Series Data
Abstract
The existing literature on forecasting time series data is largely based on univariate analysis and techniques such as Univariate Autoregressive (UAR) models, Univariate Moving Average (UMA) models, Simple Exponential Smoothing (SES), and deep learning models, most notably univariate Long Short-Term Memory (LSTM) networks, in which the past lags of a single variable are used to forecast its next values. This paper takes this line of research to the next level by forecasting time series data through "multivariate" modeling and analysis. To gain better insight into the performance of various deep learning-based models under multivariate analysis, the paper builds and reports the forecasting accuracy of a Transformer-based Multi-Head Attention network, Long Short-Term Memory (LSTM), Bidirectional Long Short-Term Memory (BI-LSTM), a Temporal Convolutional Network (TCN), and conventional Vector Autoregressive (VAR) models. The findings reveal that the TCN model achieved the lowest average RMSE values: 0.0589 for stock data and 0.1554 for cryptocurrency data. Notably, the Multi-Head Attention model achieved average $R^{2}$ values of 0.92 for stock data and −1.98 for cryptocurrency data across five variables (i.e., open, high, low, close, and volume). According to the empirical studies conducted and reported in this paper, the Transformer-based Multi-Head Attention network outperformed the other models, including LSTM, BI-LSTM, and, more importantly, conventional Vector Autoregressive (VAR) models, on stock and cryptocurrency time series data in which several variables were leveraged to build these multivariate models.
Keywords