Machine Learning: Science and Technology (Jan 2025)

Nearest-neighbors neural network architecture for efficient sampling of statistical physics models

  • Luca Maria Del Bono
  • Federico Ricci-Tersenghi
  • Francesco Zamponi

DOI
https://doi.org/10.1088/2632-2153/adcdc1
Journal volume & issue
Vol. 6, no. 2
p. 025029

Abstract

Efficiently sampling the Gibbs–Boltzmann distribution of disordered systems is important both for the theoretical understanding of these models and for the solution of practical optimization problems. Unfortunately, this task is known to be hard, especially for spin-glass-like problems at low temperatures. Recently, many attempts have been made to tackle the problem by combining classical Monte Carlo schemes with neural networks that learn to propose smart moves. In this article, we introduce the nearest-neighbors neural network (4N) architecture, a physically interpretable deep architecture whose number of parameters scales linearly with the size of the system and which can be applied to a wide variety of topologies. We show that the 4N architecture can accurately learn the Gibbs–Boltzmann distribution of a prototypical spin-glass model, the two-dimensional Edwards–Anderson model, including some of its most difficult instances. In particular, it captures properties such as the energy, the correlation function, and the overlap probability distribution. Finally, we show that the performance of 4N increases with the number of layers, in a way that clearly connects to the correlation length of the system, thus providing a simple and interpretable criterion for choosing the optimal depth.
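
For orientation, here is a minimal sketch of the sampling problem the abstract describes: single-spin Metropolis dynamics for the two-dimensional Edwards–Anderson model with ±1 couplings and periodic boundaries. This is not the authors' 4N architecture; in the neural-assisted schemes the abstract refers to, the naive spin-flip proposal below would be replaced by moves generated from a trained network and accepted with the corresponding Metropolis–Hastings ratio. The lattice size, temperature, and number of sweeps are arbitrary illustrative choices.

```python
# Baseline Metropolis sampler for the 2D Edwards-Anderson model (sketch only;
# the paper's 4N network, which would supply smarter proposals, is not shown).
import numpy as np

rng = np.random.default_rng(0)
L, beta = 16, 1.0                        # linear lattice size, inverse temperature (assumed)
J_h = rng.choice([-1.0, 1.0], (L, L))    # coupling of spin (i, j) to its right neighbor
J_v = rng.choice([-1.0, 1.0], (L, L))    # coupling of spin (i, j) to its bottom neighbor
s = rng.choice([-1.0, 1.0], (L, L))      # initial spin configuration

def delta_energy(s, i, j):
    """Energy change from flipping spin (i, j), with periodic boundaries."""
    h = (J_h[i, j] * s[i, (j + 1) % L] + J_h[i, (j - 1) % L] * s[i, (j - 1) % L]
       + J_v[i, j] * s[(i + 1) % L, j] + J_v[(i - 1) % L, j] * s[(i - 1) % L, j])
    return 2.0 * s[i, j] * h             # H = -sum_<ij> J_ij s_i s_j

def sweep(s):
    """One Metropolis sweep: L*L single-spin flip attempts."""
    for _ in range(L * L):
        i, j = rng.integers(L), rng.integers(L)
        dE = delta_energy(s, i, j)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i, j] = -s[i, j]
    return s

for _ in range(200):
    s = sweep(s)

# energy per spin of the current sample
E = -(np.sum(J_h * s * np.roll(s, -1, axis=1))
      + np.sum(J_v * s * np.roll(s, -1, axis=0))) / L**2
print(f"energy per spin: {E:.3f}")
```

At low temperature such local dynamics decorrelates very slowly on hard instances, which is precisely the regime where learned proposals such as those produced by the 4N architecture are intended to help.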

Keywords