IEEE Access (Jan 2024)
Multi-Scale Transformer Pyramid Networks for Multivariate Time Series Forecasting
Abstract
Multivariate Time Series (MTS) forecasting requires modeling temporal dependencies in historical records. Transformers have demonstrated remarkable performance in MTS forecasting owing to their ability to capture long-term dependencies. However, prior work has been confined to modeling temporal dependencies at either a fixed scale or at multiple scales that grow exponentially (typically with base 2), which impedes their ability to capture diverse seasonalities effectively. In this study, we present a dimension-invariant embedding technique that captures short-term temporal dependencies by projecting MTS data into a higher-dimensional space while preserving both the original time steps and the variable dimensions. We further propose a novel Multi-scale Transformer Pyramid Network (MTPNet) designed to capture temporal dependencies at multiple unconstrained scales. Predictions are inferred from the multi-scale latent representations produced by transformers at the various scales. Extensive experiments on nine benchmark datasets demonstrate that the proposed MTPNet outperforms recent state-of-the-art methods. The improvement is particularly pronounced on datasets rich in fine-scale information, where MTPNet captures a wide spectrum of temporal dependencies, from fine to coarse scales; this highlights MTPNet's potential for analyzing MTS data sampled at the minute level. Code is available at github.com/MTPNet.
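To make the embedding idea concrete, the following minimal PyTorch sketch shows one plausible realization of a dimension-invariant embedding: a small convolution that slides along the time axis only, so the channel dimension grows to d_model while the time and variable axes are untouched. The class name DIEmbedding, the (kernel_size, 1) kernel choice, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class DIEmbedding(nn.Module):
    """Illustrative dimension-invariant embedding (hypothetical sketch).

    Projects an MTS window of shape (batch, time, variables) into
    (batch, d_model, time, variables): the channel dimension is expanded
    to d_model while the time-step and variable axes are preserved.
    """

    def __init__(self, d_model: int, kernel_size: int = 3):
        super().__init__()
        # A (k, 1) convolution mixes nearby time steps (short-term
        # dependencies) without mixing or resizing the variable axis;
        # "same" padding along time keeps the sequence length unchanged.
        self.proj = nn.Conv2d(
            in_channels=1,
            out_channels=d_model,
            kernel_size=(kernel_size, 1),
            padding=(kernel_size // 2, 0),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, variables) -> add a singleton channel axis.
        x = x.unsqueeze(1)    # (batch, 1, time, variables)
        return self.proj(x)   # (batch, d_model, time, variables)

# Shape check: time steps and variable count are preserved.
emb = DIEmbedding(d_model=64)
out = emb(torch.randn(8, 96, 7))  # e.g. 96 time steps, 7 variables
assert out.shape == (8, 64, 96, 7)
```

Because the time and variable dimensions survive the projection, transformers at different pyramid scales can consume the same embedded tensor without scale-specific re-embedding, which is consistent with the multi-scale design the abstract describes.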
Keywords