Human-Centric Intelligent Systems (Jul 2023)

DBAFormer: A Double-Branch Attention Transformer for Long-Term Time Series Forecasting

  • Ji Huang,
  • Minbo Ma,
  • Yongsheng Dai,
  • Jie Hu,
  • Shengdong Du

DOI: https://doi.org/10.1007/s44230-023-00037-z
Journal volume & issue: Vol. 3, No. 3, pp. 263–274

Abstract

Transformer-based approaches excel at long-term series forecasting. These models leverage stacked structures and self-attention mechanisms to effectively model dependencies in series data. While some approaches adopt sparse attention to tackle the quadratic time complexity of self-attention, sparsity can limit information utilization. We introduce a novel double-branch attention mechanism that simultaneously captures intricate dependencies from both the temporal and the variable perspective. Moreover, we propose query-independent attention, motivated by the near-identical attention that self-attention allocates to different query positions; this enhances efficiency and reduces the impact of redundant information. We integrate the double-branch query-independent attention into popular transformer-based methods such as Informer, Autoformer, and the Non-stationary Transformer. Experiments on six practical benchmarks consistently confirm that our attention mechanism substantially improves long-term series forecasting performance over the baseline approaches.
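The abstract does not spell out the mechanism's equations, but the two ideas can be illustrated concretely. Below is a minimal PyTorch sketch, assuming (i) "query-independent attention" means a single softmax distribution over positions, computed without a per-query Q·K^T product and shared by every query, and (ii) the two branches apply this attention over the time axis and the variable axis respectively, fused by addition. All class, layer, and parameter names here are hypothetical illustrations, not the authors' implementation.

```python
import torch
import torch.nn as nn

class QueryIndependentAttention(nn.Module):
    """Hypothetical sketch: one softmax distribution over positions,
    computed from the input alone and shared by every query position
    (no per-query Q.K^T matrix)."""
    def __init__(self, d_feat):
        super().__init__()
        self.score = nn.Linear(d_feat, 1)      # one scalar score per position
        self.value = nn.Linear(d_feat, d_feat) # value projection

    def forward(self, x):                      # x: (batch, positions, d_feat)
        w = torch.softmax(self.score(x), dim=1)                 # (B, P, 1), shared weights
        context = (w * self.value(x)).sum(dim=1, keepdim=True)  # (B, 1, d_feat)
        return x + context                     # broadcast context to all positions

class DoubleBranchAttention(nn.Module):
    """Hypothetical double-branch block: the temporal branch attends over
    time steps, the variable branch attends over series variables on the
    transposed input; outputs are fused by simple addition (assumption)."""
    def __init__(self, seq_len, n_vars):
        super().__init__()
        self.temporal = QueryIndependentAttention(n_vars)   # positions = time steps
        self.variable = QueryIndependentAttention(seq_len)  # positions = variables

    def forward(self, x):                      # x: (B, seq_len, n_vars)
        t = self.temporal(x)                                   # dependencies across time
        v = self.variable(x.transpose(1, 2)).transpose(1, 2)   # dependencies across variables
        return t + v

# Usage sketch: a batch of 96-step, 7-variable series.
x = torch.randn(8, 96, 7)
print(DoubleBranchAttention(96, 7)(x).shape)   # torch.Size([8, 96, 7])
```

Under these assumptions, each branch computes a single shared attention distribution in O(L) time per sequence rather than the O(L^2) of full self-attention, which is consistent with the abstract's efficiency claim.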

Keywords