Scientific Reports (Mar 2021)
A cautionary tale for machine learning generated configurations in presence of a conserved quantity
Abstract
We investigate the performance of machine learning algorithms trained exclusively on configurations obtained from importance sampling Monte Carlo simulations of the two-dimensional Ising model with conserved magnetization. For supervised machine learning, we use convolutional neural networks and find that the corresponding output not only allows us to locate the phase transition point with high precision, but also displays finite-size scaling characterized by an Ising critical exponent. For unsupervised learning, restricted Boltzmann machines (RBMs) are trained to generate new configurations that are then used to compute various quantities. We find that the RBMs generate configurations with magnetizations and energies that are forbidden in the original physical system. The RBM-generated configurations yield energy density probability distributions with incorrect weights as well as incorrect spatial correlations. We show that similar shortcomings are encountered when training RBMs with configurations obtained from the non-conserved Ising model.
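As a point of reference for the conserved-magnetization sampling described above, the following is a minimal sketch, assuming a square lattice with periodic boundaries and coupling J = 1, of Kawasaki-type spin-exchange Monte Carlo, which by construction keeps the total magnetization fixed. This is not the authors' code; the lattice size, inverse temperature, and function names are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): spin-exchange Monte Carlo for the
# 2D Ising model, which conserves the total magnetization by construction.
import numpy as np

def local_field(spins, i, j):
    """Sum of the four nearest-neighbour spins of site (i, j), periodic boundaries."""
    L = spins.shape[0]
    return (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
            + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])

def kawasaki_sweep(spins, beta, rng):
    """One Monte Carlo sweep of spin exchanges between opposite spins."""
    L = spins.shape[0]
    for _ in range(L * L):
        i1, j1, i2, j2 = rng.integers(0, L, size=4)
        if spins[i1, j1] == spins[i2, j2]:
            continue  # exchanging equal spins changes nothing
        # Energy change of swapping the two opposite spins (J = 1).
        dE = 2 * (spins[i1, j1] * local_field(spins, i1, j1)
                  + spins[i2, j2] * local_field(spins, i2, j2))
        # If the two sites are nearest neighbours, the shared bond is unchanged
        # by the swap; correct for it being counted in both local fields.
        nn = (i1 == i2 and (j1 - j2) % L in (1, L - 1)) or \
             (j1 == j2 and (i1 - i2) % L in (1, L - 1))
        if nn:
            dE += 4
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i1, j1], spins[i2, j2] = spins[i2, j2], spins[i1, j1]
    return spins

if __name__ == "__main__":
    L, beta = 32, 0.5            # illustrative lattice size and inverse temperature
    rng = np.random.default_rng(0)
    # Half-up/half-down start: total magnetization is exactly zero.
    spins = np.ones((L, L), dtype=int)
    spins[:, : L // 2] = -1
    for _ in range(100):
        kawasaki_sweep(spins, beta, rng)
    print("magnetization per spin:", spins.mean())   # remains exactly 0
```

Because every accepted move exchanges a pair of opposite spins, the magnetization printed at the end is exactly the value fixed by the initial configuration; this is the constraint that, according to the abstract, RBM-generated configurations fail to respect.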