Environmental Research Letters (Jan 2023)

An attention-based LSTM model for long-term runoff forecasting and factor recognition

  • Dongyang Han,
  • Pan Liu,
  • Kang Xie,
  • He Li,
  • Qian Xia,
  • Qian Cheng,
  • Yibo Wang,
  • Zhikai Yang,
  • Yanjun Zhang,
  • Jun Xia

DOI
https://doi.org/10.1088/1748-9326/acaedd
Journal volume & issue
Vol. 18, no. 2
p. 024004

Abstract

With advances in artificial intelligence, machine learning-based models such as long short-term memory (LSTM) networks have shown promise in long-term runoff forecasting by mapping relationships between large-scale climate patterns and catchment runoff responses without representing physical processes. The recognition of key input factors strongly affects model performance, yet there is no consensus on which recognition algorithm is most suitable. To address this issue, an LSTM model combined with two attention mechanisms, applied to the input and hidden layers respectively (AT-LSTM), is proposed for long-term runoff forecasting at the Yichang and Pingshan stations in China. The attention mechanisms automatically assign weights to 130 climate phenomenon indexes, avoiding subjectively chosen recognition algorithms. Results show that the AT-LSTM model outperforms a Pearson's-correlation-based LSTM model on four evaluation metrics for monthly runoff forecasting. Further, an indirect runoff prediction experiment verifies that the AT-LSTM model also performs well in forecasting precipitation and potential evapotranspiration, although this indirect approach is inferior to the AT-LSTM model that links climate factors directly to runoff. Finally, four key factors related to runoff are identified by the attention mechanism, and their impacts on runoff are analyzed at intra- and inter-annual scales. The proposed AT-LSTM model effectively improves long-term forecasting accuracy and identifies the dynamic influence of the input factors.
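The sketch below illustrates how a dual-attention LSTM of the kind described in the abstract might be wired: an input-layer attention that weights the 130 climate indexes at each time step, and a hidden-layer (temporal) attention that weights the LSTM states across the look-back window before the runoff prediction. It is a minimal PyTorch illustration, not the authors' implementation; the class name ATLSTM, the hidden size, the window length, and the specific attention layers are assumptions for demonstration only.

```python
import torch
import torch.nn as nn


class ATLSTM(nn.Module):
    """Illustrative LSTM with input-feature attention and temporal attention (not the paper's exact model)."""

    def __init__(self, n_features=130, hidden_size=64, horizon=1):
        super().__init__()
        # Input attention: scores each climate index at every time step.
        self.input_attn = nn.Linear(n_features, n_features)
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        # Temporal attention: scores each hidden state in the look-back window.
        self.temporal_attn = nn.Linear(hidden_size, 1)
        self.head = nn.Linear(hidden_size, horizon)

    def forward(self, x):
        # x: (batch, time, n_features), e.g. monthly values of 130 standardized climate indexes.
        feat_weights = torch.softmax(self.input_attn(x), dim=-1)   # attention weights over input features
        h, _ = self.lstm(x * feat_weights)                          # (batch, time, hidden_size)
        time_weights = torch.softmax(self.temporal_attn(h), dim=1)  # attention weights over time steps
        context = (time_weights * h).sum(dim=1)                     # weighted sum of hidden states
        return self.head(context), feat_weights, time_weights


# Usage: a random batch standing in for 12 months of 130 climate indexes.
model = ATLSTM()
x = torch.randn(8, 12, 130)
runoff_pred, feat_w, time_w = model(x)
print(runoff_pred.shape)  # torch.Size([8, 1])
```

The returned feature and temporal attention weights are what would support the factor-recognition analysis mentioned in the abstract: averaging the feature weights over time and samples indicates which climate indexes the model relies on most.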

Keywords