Journal of Hydroinformatics (Nov 2023)

A new stable and interpretable flood forecasting model combining multi-head attention mechanism and multiple linear regression

  • Yi-yang Wang,
  • Wenchuan Wang,
  • Kwok-wing Chau,
  • Dong-mei Xu,
  • Hong-fei Zang,
  • Chang-jun Liu,
  • Qiang Ma

DOI
https://doi.org/10.2166/hydro.2023.160
Journal volume & issue
Vol. 25, no. 6
pp. 2561 – 2588

Abstract

This article proposes a multi-head attention flood forecasting model (MHAFFM) that combines a multi-head attention mechanism (MHAM) with multiple linear regression for flood forecasting. Compared to models based on Long Short-Term Memory (LSTM) neural networks, MHAFFM enables precise and stable multi-hour flood forecasting. First, the model exploits the full-batch, stable input characteristics of multiple linear regression to eliminate the oscillation seen in the predictions of existing models. Second, full-batch information is fed to the MHAM to improve the model's ability to process and interpret high-dimensional information. Finally, the model predicts future flood processes accurately and stably through linear layers. The model is applied to the Dawen River Basin, and experimental results show that MHAFFM significantly improves prediction performance under different lead-time scenarios, while maintaining good stability and interpretability, compared to three benchmark models: LSTM, BOA-LSTM (LSTM with hyperparameters tuned by a Bayesian optimization algorithm), and MHAM-LSTM (LSTM with an MHAM in its hidden layer). Taking the Nash–Sutcliffe efficiency index as an example, at a lead time of 3 h, MHAFFM improves on the three benchmark models by 8.85, 3.71, and 10.29%, respectively. This research provides a new approach for flood forecasting.

HIGHLIGHTS

  • Proposes a novel multi-head attention flood forecasting model (MHAFFM).
  • The multi-head attention mechanism strengthens the model's ability to handle high-dimensional data.
  • Linear layers effectively harness the performance of the multi-head attention mechanism.
  • MHAFFM significantly enhances the stability of forecasted results.
  • Even in longer lead-time scenarios, the model maintains high accuracy and stability.
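To make the architecture described above concrete, the following is a minimal NumPy sketch of the core computation the abstract outlines: a scaled dot-product attention split across several heads, followed by a plain linear layer that maps the attended features to a flow forecast. This is an illustrative toy, not the authors' implementation; the input shape (6 hourly time steps, 8 hydrological features), the number of heads, and all weight matrices are assumed here purely for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, Wq, Wk, Wv, Wo, n_heads):
    """Scaled dot-product attention computed in n_heads parallel heads.

    X: (seq_len, d_model); Wq, Wk, Wv, Wo: (d_model, d_model).
    """
    seq_len, d_model = X.shape
    d_head = d_model // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv

    def split(M):
        # (seq_len, d_model) -> (n_heads, seq_len, d_head)
        return M.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)

    Qh, Kh, Vh = split(Q), split(K), split(V)
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)   # (n_heads, seq_len, seq_len)
    out = softmax(scores) @ Vh                              # (n_heads, seq_len, d_head)
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)  # concatenate heads
    return out @ Wo

rng = np.random.default_rng(0)
seq_len, d_model, n_heads = 6, 8, 2        # assumed toy dimensions
X = rng.normal(size=(seq_len, d_model))    # stand-in for full-batch hydrological inputs
Wq, Wk, Wv, Wo = (rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(4))

attended = multi_head_attention(X, Wq, Wk, Wv, Wo, n_heads)
# Final linear layer: map the attended features of the last step to one forecast value.
w, b = rng.normal(size=d_model), 0.0
forecast = attended[-1] @ w + b
print(attended.shape, float(forecast))
```

In the paper's design, the attention weights over input features and time steps are what give the model its interpretability, while the closing linear layer keeps the mapping from attended features to discharge simple and stable.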

Keywords