Journal of Hydroinformatics (Nov 2023)

Assessing the performances and transferability of graph neural network metamodels for water distribution systems

  • Bulat Kerimov,
  • Roberto Bentivoglio,
  • Alexander Garzón,
  • Elvin Isufi,
  • Franz Tscheikner-Gratl,
  • David Bernhard Steffelbauer,
  • Riccardo Taormina

DOI
https://doi.org/10.2166/hydro.2023.031
Journal volume & issue
Vol. 25, no. 6
pp. 2223 – 2234

Abstract


Metamodels accurately reproduce the output of physics-based hydraulic models with a significant reduction in simulation times. They are widely employed in water distribution system (WDS) analysis, since they enable computationally expensive applications in the design, control, and optimisation of water networks. Recent machine-learning-based metamodels offer improved fidelity and speed; however, they are only applicable to the water network they were trained on. To address this issue, we investigate graph neural networks (GNNs) as metamodels for WDSs. GNNs leverage the networked structure of WDSs by learning shared coefficients, thus offering the potential of transferability. This work evaluates the suitability of GNNs as metamodels for estimating nodal pressures in steady-state EPANET simulations. We first compare the effectiveness of GNN metamodels against multi-layer perceptrons (MLPs) on several benchmark WDSs. Then, we explore the transferability of GNNs by training them concurrently on multiple WDSs. For each configuration, we calculate model accuracy and speedups with respect to the original numerical model. GNNs perform similarly to MLPs in terms of accuracy and take longer to execute, but may still provide substantial speedup. Our preliminary results indicate that GNNs can learn shared representations across networks, although assessing the feasibility of truly general metamodels requires further work.

HIGHLIGHTS

  • The accuracy of GNN-based and MLP-based metamodels is comparable on most of the studied water networks.
  • The proposed model can be trained on several water networks at once and can learn shared representations between them.
  • By learning shared representations, the model achieves comparable performance while requiring fewer training examples.
  • GNNs show promising transferability results, although further study is required.
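The transferability claimed in the abstract hinges on the GNN learning coefficients that are shared across all nodes of the graph, so the same trained layer can be applied to a network of any size or topology. A minimal sketch of one such message-passing layer is shown below; the toy 4-node ring topology, the feature choices (demand, elevation), and all weight values are invented for illustration and are not taken from the paper or from EPANET.

```python
# Minimal sketch of one GNN message-passing layer over a water network.
# The topology, features, and weights below are illustrative assumptions.

def gnn_layer(features, adjacency, w_self, w_nbr):
    """One round of message passing with weights shared across nodes.

    features:  {node: [demand, elevation]}  per-junction inputs
    adjacency: {node: [neighbour nodes]}    pipes of the WDS
    Returns a scalar embedding per node (e.g. a pressure surrogate).
    """
    out = {}
    for node, x in features.items():
        nbrs = adjacency[node]
        # mean-aggregate neighbour features over connecting pipes
        agg = [sum(features[n][k] for n in nbrs) / len(nbrs)
               for k in range(len(x))]
        # shared linear transform + ReLU; using the SAME w_self/w_nbr
        # for every node is what makes the layer transferable across
        # networks of different sizes
        val = sum(ws * xi for ws, xi in zip(w_self, x))
        val += sum(wn * ai for wn, ai in zip(w_nbr, agg))
        out[node] = max(0.0, val)
    return out

# Toy looped network: four junctions connected in a ring
adjacency = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
features = {0: [1.0, 10.0], 1: [0.5, 12.0],
            2: [0.8, 11.0], 3: [0.2, 9.0]}

embeddings = gnn_layer(features, adjacency,
                       w_self=[0.3, 0.05], w_nbr=[0.2, 0.01])
print(embeddings)
```

An MLP metamodel, by contrast, would flatten all nodal inputs into one fixed-length vector, tying the learned weights to one specific network layout; this is the architectural difference the paper's comparison probes.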

Keywords