IEEE Access (Jan 2024)

Enhancing Multivariate Time Series Classifiers Through Self-Attention and Relative Positioning Infusion

  • Mehryar Abbasi,
  • Parvaneh Saeedi

DOI
https://doi.org/10.1109/ACCESS.2024.3397783
Journal volume & issue
Vol. 12
pp. 67273 – 67290

Abstract


Time Series Classification (TSC) is an important and challenging task for many visual computing applications. Despite the extensive range of methods developed for TSC, only a few are based on Deep Neural Networks (DNNs). In this paper, we present two novel attention blocks, Global Temporal Attention (GTA) and Temporal Pseudo-Gaussian augmented Self-attention (TPS), that can enhance deep learning-based TSC approaches, even when such approaches are designed and optimized for specific datasets or tasks. We validate the performance of the proposed blocks using multiple state-of-the-art deep learning-based TSC models on the University of East Anglia (UEA) benchmark, a standardized collection of 30 Multivariate Time Series Classification (MTSC) datasets. We demonstrate that adding the proposed attention blocks increases base models' average accuracy by up to 3.6%. Additionally, the proposed TPS block uses a new injection module to incorporate relative positional information into transformers. As a standalone unit with lower computational complexity, TPS outperforms most state-of-the-art DNN-based TSC methods. The source code for our setups and the attention blocks is publicly available (https://github.com/mehryar72/TimeSeriesClassification-TPS).
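To illustrate the general idea of injecting relative positional information into self-attention, as the abstract describes for the TPS block, the sketch below adds a Gaussian-shaped relative-position bias to standard scaled dot-product attention logits. This is a minimal NumPy illustration under our own assumptions (the function names, the exact bias form, and the `sigma` parameter are hypothetical), not the authors' implementation; the actual TPS module is defined in the linked repository.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gaussian_relative_bias(seq_len, sigma=2.0):
    # Hypothetical pseudo-Gaussian bias: largest when query and key
    # positions are close, decaying with their relative distance.
    idx = np.arange(seq_len)
    rel = idx[:, None] - idx[None, :]          # relative offsets i - j
    return np.exp(-(rel ** 2) / (2.0 * sigma ** 2))

def self_attention_with_bias(x, Wq, Wk, Wv, sigma=2.0):
    # x: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_head) projections
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d = q.shape[-1]
    logits = (q @ k.T) / np.sqrt(d)            # standard attention scores
    logits += gaussian_relative_bias(x.shape[0], sigma)  # inject relative positions
    return softmax(logits, axis=-1) @ v
```

Because the bias is added to the attention logits before the softmax, nearby time steps receive a boost in attention weight without any absolute positional encoding being concatenated to the inputs.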

Keywords