IEEE Access (Jan 2025)

Local-Global and Multi-Scale (LG-MS) Mixer Architecture for Long-Term Time Series Forecasting

  • Zhennan Peng,
  • Boyong Gao,
  • Ziqi Xia,
  • Jie Liu

DOI
https://doi.org/10.1109/ACCESS.2024.3524499
Journal volume & issue
Vol. 13
pp. 9199–9208

Abstract

Although deep learning models dominate time series forecasting, they still struggle with long-sequence processing due to the challenges of extracting dynamic fluctuations and pattern features as input length increases. To address this issue, we propose a framework, LG-MSMixer, that enhances long-term time series forecasting through three key steps: multi-scale dual decomposition, local-global information extraction, and fusion prediction. Specifically, we first conduct multi-scale dual decomposition of the long input sequence to derive seasonal-trend component combinations. To capture more comprehensive and effective information within these components, we then use a customized patch-based triple-attention local-global information extractor that models both temporal features and variable dependencies, alongside an MLP-based feature interaction iterator that facilitates interactions among the multi-scale information to guide macro-level predictions. Finally, we integrate the predictions from the multi-scale sequences to leverage their complementary advantages. In our experiments, we demonstrate the effectiveness of LG-MSMixer across various real-world long-term forecasting tasks, where it significantly outperforms previous baselines.
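The first step of the pipeline, multi-scale seasonal-trend decomposition, is commonly implemented with moving-average filters of different kernel sizes (trend = smoothed series, seasonal = residual). The sketch below illustrates that general idea only; the kernel sizes, the `series_decomp`/`multi_scale_decomp` names, and the exact form of the paper's "dual decomposition" are assumptions, not the authors' implementation.

```python
import numpy as np

def series_decomp(x, kernel):
    """Moving-average seasonal-trend split of a 1-D series.

    trend    = moving average with the given kernel size
    seasonal = residual (x - trend)
    Edge values are replicated so the trend has the same length as x.
    Note: a sketch of a standard decomposition, not the paper's exact method.
    """
    pad = kernel // 2
    padded = np.concatenate([
        np.repeat(x[:1], pad),               # replicate left edge
        x,
        np.repeat(x[-1:], kernel - 1 - pad), # replicate right edge
    ])
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    seasonal = x - trend
    return seasonal, trend

def multi_scale_decomp(x, kernels=(5, 25)):
    """One (seasonal, trend) pair per scale; kernel sizes are illustrative."""
    return [series_decomp(x, k) for k in kernels]
```

Each scale's seasonal-trend pair would then feed the local-global extractor, and the per-scale predictions are fused at the end, as described in the abstract.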

Keywords