IEEE Access (Jan 2022)

Umformer: A Transformer Dedicated to Univariate Multistep Prediction

  • Min Li,
  • Qinghui Chen,
  • Gang Li,
  • Delong Han

DOI
https://doi.org/10.1109/ACCESS.2022.3208139
Journal volume & issue
Vol. 10
pp. 101347 – 101361

Abstract


Univariate multi-step time series forecasting (UMTF) has many applications, such as forecasting access traffic. Solving the UMTF problem requires efficiently capturing the key information in univariate data and improving the accuracy of multi-step forecasting. The advent of deep learning (DL) enables multi-level, high-performance prediction from complex multivariate inputs, but research on the UMTF problem remains extremely scarce, and existing methods cannot satisfy recent univariate forecasting tasks in terms of accuracy, efficiency, and other criteria. This paper proposes a Transformer-based univariate multi-step forecasting model: Umformer. The contributions are: (1) To maximize the information obtained from a single variable, we propose a Prophet-based method for variable extraction that additionally considers correlated variables for accurate prediction. (2) A gated linear unit variant with three weight matrices (GLUV3) is designed as a gating mechanism that improves selective memory over long sequences, thereby extracting more useful information from the limited univariate input and improving prediction accuracy. (3) A Shared Double-heads ProbSparse Attention (SDHPA) mechanism reduces the memory footprint and improves attention awareness. We combine recent advances in DL to achieve high-precision prediction for UMTF. Extensive experiments on public datasets from five different domains show, across five metrics, that Umformer significantly outperforms existing methods, offering a more efficient solution for UMTF.
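The abstract does not give the exact formulation of GLUV3, but a gated linear unit built from three weight matrices is commonly structured as a sigmoid gate multiplied element-wise by a linear branch, followed by an output projection. The sketch below illustrates that general pattern; the function name `glu_v3`, the dimensions, and the absence of bias terms are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def sigmoid(x):
    # Numerically standard logistic gate in [0, 1]
    return 1.0 / (1.0 + np.exp(-x))

def glu_v3(x, W, V, U):
    """Sketch of a three-weight-matrix gated linear unit:
    sigmoid(x @ W) gates the linear branch x @ V element-wise
    (selectively passing features), and U projects the gated
    result back to the model dimension."""
    return (sigmoid(x @ W) * (x @ V)) @ U

rng = np.random.default_rng(0)
d_model, d_hidden = 8, 16
x = rng.standard_normal((4, d_model))         # a batch of 4 positions
W = rng.standard_normal((d_model, d_hidden))  # gate weights
V = rng.standard_normal((d_model, d_hidden))  # value weights
U = rng.standard_normal((d_hidden, d_model))  # output projection
y = glu_v3(x, W, V, U)
print(y.shape)  # (4, 8)
```

The multiplicative sigmoid gate is what gives the unit its "selective memory": features whose gate activation is near zero are suppressed before the output projection.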

Keywords