IET Generation, Transmission & Distribution (Apr 2023)

A deep LSTM‐CNN based on self‐attention mechanism with input data reduction for short‐term load forecasting

  • Shiyan Yi,
  • Haichun Liu,
  • Tao Chen,
  • Jianwen Zhang,
  • Yibo Fan

DOI
https://doi.org/10.1049/gtd2.12763
Journal volume & issue
Vol. 17, no. 7
pp. 1538–1552

Abstract

Numerous studies on short-term load forecasting (STLF) have used feature-extraction methods to improve model accuracy by incorporating multidimensional features containing time, weather, and distance information. However, less attention has been paid to the input data size and the output dimensions in STLF. To address these two issues, an STLF model based on the output dimensions is proposed that uses only load data. First, the long-term behaviour of the load data (trend and seasonality) is extracted by a long short-term memory network (LSTM); a convolution is then applied to capture the load data's non-stationarity. Next, the self-attention mechanism (SAM) emphasizes the crucial input load information during forecasting. The case study shows that the proposed algorithm outperforms LSTM, LSTM-based SAM, and CNN-GRU-based SAM by more than 10% across eight different buildings, demonstrating its suitability for forecasting with load data alone. Additionally, compared with earlier research on two well-known public data sets, the MAPE is improved by 2.2% and 5%, respectively. The method also achieves good prediction accuracy over a wide range of time granularities and load aggregation levels, so it can be applied to various load forecasting scenarios and provides a useful reference for load forecasting practice.
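
The abstract describes a pipeline in which an LSTM extracts long-term behaviour, a convolution captures non-stationarity, and self-attention re-weights the important time steps. The following is a minimal, hypothetical PyTorch sketch of such an LSTM-CNN-SAM architecture; the class name LSTMCNNSelfAttention and all layer sizes, the input window length, and the forecast horizon are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class LSTMCNNSelfAttention(nn.Module):
    """Illustrative LSTM -> CNN -> self-attention forecaster (assumed sizes)."""
    def __init__(self, n_features=1, hidden=64, conv_channels=32, horizon=24):
        super().__init__()
        # LSTM extracts long-term behaviour (trend and seasonality) of the load series
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        # 1-D convolution over the LSTM outputs captures local non-stationarity
        self.conv = nn.Conv1d(hidden, conv_channels, kernel_size=3, padding=1)
        # Self-attention emphasizes the most informative input time steps
        self.attn = nn.MultiheadAttention(conv_channels, num_heads=4, batch_first=True)
        self.head = nn.Linear(conv_channels, horizon)

    def forward(self, x):                 # x: (batch, seq_len, n_features)
        h, _ = self.lstm(x)               # (batch, seq_len, hidden)
        c = self.conv(h.transpose(1, 2)).transpose(1, 2)  # (batch, seq_len, conv_channels)
        a, _ = self.attn(c, c, c)         # self-attention with Q = K = V
        return self.head(a[:, -1])        # forecast from the last attended step

model = LSTMCNNSelfAttention()
y = model(torch.randn(8, 96, 1))          # e.g. 96 past load samples per series
print(y.shape)                            # torch.Size([8, 24]): 24-step-ahead forecast

Note that the forecast horizon here is the output dimension the abstract refers to; using only the univariate load history as input is what keeps the input data size small.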

Keywords