IEEE Access (Jan 2024)

Air Quality Prediction Based on Time Series Decomposition and Convolutional Sparse Self-Attention Mechanism Transformer Model

  • Wenyi Cao,
  • Weiwei Qi,
  • Peiqi Lu

DOI: https://doi.org/10.1109/ACCESS.2024.3484579
Journal volume & issue: Vol. 12, pp. 155340 – 155350

Abstract

This study introduces an innovative air quality prediction model, TD-CS-Transformer, which fuses time series decomposition with a convolutional sparse self-attention Transformer. It addresses the inefficiency of traditional models on long sequence data and their limited ability to capture long-distance dependencies. The model simplifies its structure and reduces computation by decomposing the series into components, while convolutional sparse self-attention strengthens the capture of long-distance dependencies and improves accuracy. Experiments on public datasets show that TD-CS-Transformer outperforms existing methods in PM2.5 and PM10 concentration prediction, combining high accuracy, low error, fast training, and a small memory footprint, demonstrating strong practicality and scalability. In this study, the model is applied to an in-depth analysis of air quality data comprising more than 180,000 records from the last five years. The time series decomposition technique splits the complex time series into trend, seasonal, periodic, and irregular components, simplifying the data patterns. A Transformer with a convolutional sparse self-attention mechanism then processes these components, which significantly improves the model's ability to capture long-distance dependencies and reduces computational complexity through sparse connections, enabling the model to maintain high-accuracy predictions.
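To make the two ingredients in the abstract concrete, the sketch below shows one way they are commonly realized in PyTorch: a moving-average decomposition that separates a trend from the remaining seasonal/irregular signal, and a self-attention layer whose queries and keys come from 1D convolutions and whose score matrix is sparsified by keeping only the top-k entries per query. All names, the kernel size, the window length, and the top-k sparsity rule are illustrative assumptions, not the authors' exact TD-CS-Transformer design.

    # Illustrative sketch only; not the authors' implementation.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def decompose(x: torch.Tensor, window: int = 24):
        """Split a series of shape (batch, length, features) into a trend
        (centered moving average) and the remainder, a simple stand-in for
        the paper's time series decomposition step."""
        pad = (window - 1) // 2
        xt = x.transpose(1, 2)                                   # (B, F, L)
        xt = F.pad(xt, (pad, window - 1 - pad), mode="replicate")
        trend = F.avg_pool1d(xt, kernel_size=window, stride=1).transpose(1, 2)
        return trend, x - trend                                  # trend, seasonal/irregular part

    class ConvSparseSelfAttention(nn.Module):
        """Self-attention whose queries/keys are produced by 1D convolutions
        over the sequence, with only the top-k scores kept per query
        (sparse connections) before the softmax."""
        def __init__(self, d_model: int, kernel_size: int = 3, top_k: int = 16):
            super().__init__()
            pad = kernel_size // 2
            self.q_conv = nn.Conv1d(d_model, d_model, kernel_size, padding=pad)
            self.k_conv = nn.Conv1d(d_model, d_model, kernel_size, padding=pad)
            self.v_proj = nn.Linear(d_model, d_model)
            self.top_k = top_k
            self.scale = d_model ** -0.5

        def forward(self, x: torch.Tensor) -> torch.Tensor:      # x: (B, L, D)
            q = self.q_conv(x.transpose(1, 2)).transpose(1, 2)   # convolutional queries
            k = self.k_conv(x.transpose(1, 2)).transpose(1, 2)   # convolutional keys
            v = self.v_proj(x)
            scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale   # (B, L, L)
            k_eff = min(self.top_k, scores.size(-1))
            thresh = scores.topk(k_eff, dim=-1).values[..., -1:]         # k-th largest per query
            scores = scores.masked_fill(scores < thresh, float("-inf"))  # drop small scores
            return torch.matmul(F.softmax(scores, dim=-1), v)

In such a setup, the trend and remainder components would each be passed through attention blocks of this kind and the outputs recombined for the final PM2.5/PM10 forecast; the actual component handling in TD-CS-Transformer is described in the full paper.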

Keywords