Physical Review X (Oct 2018)

Deterministic and Generalized Framework for Unsupervised Learning with Restricted Boltzmann Machines

  • Eric W. Tramel,
  • Marylou Gabrié,
  • Andre Manoel,
  • Francesco Caltagirone,
  • Florent Krzakala

DOI
https://doi.org/10.1103/PhysRevX.8.041006
Journal volume & issue
Vol. 8, no. 4
p. 041006

Abstract


Restricted Boltzmann machines (RBMs) are energy-based neural networks commonly used as building blocks for deep neural architectures. In this work, we derive a deterministic framework for the training, evaluation, and use of RBMs based upon the Thouless-Anderson-Palmer (TAP) mean-field approximation, originating in spin-glass theory, for widely connected systems with weak interactions. While the TAP approach has been extensively studied for fully visible binary spin systems, our construction generalizes to latent-variable models as well as to arbitrarily distributed real-valued spin systems with bounded support. In our numerical experiments, we demonstrate effective deterministic training of the proposed models and show features of unsupervised learning that could not be directly observed with sampling. Additionally, we demonstrate how the TAP-based framework can be used to leverage trained RBMs as joint priors in denoising problems.
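The abstract does not spell out the TAP equations themselves, but a minimal sketch may help fix ideas. The Python snippet below iterates second-order TAP fixed-point equations for a binary {0,1} RBM with couplings W and biases a, b; the specific form of the second-order (Onsager) correction is assumed from the mean-field literature rather than taken from this paper, and all function and variable names are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tap_magnetizations(W, a, b, n_iter=100, damping=0.5, seed=0):
    """Iterate second-order TAP fixed-point equations for a {0,1} binary RBM.

    W : (n_vis, n_hid) couplings, a : (n_vis,) visible biases,
    b : (n_hid,) hidden biases. Returns visible/hidden magnetizations.
    """
    rng = np.random.default_rng(seed)
    mv = rng.uniform(0.45, 0.55, size=W.shape[0])  # visible magnetizations
    mh = rng.uniform(0.45, 0.55, size=W.shape[1])  # hidden magnetizations
    W2 = W ** 2
    for _ in range(n_iter):
        # Hidden update: naive mean-field term plus second-order (Onsager) correction.
        var_v = mv - mv ** 2
        mh_new = sigmoid(b + W.T @ mv - (W2.T @ var_v) * (mh - 0.5))
        mh = damping * mh + (1.0 - damping) * mh_new
        # Visible update, symmetric in form.
        var_h = mh - mh ** 2
        mv_new = sigmoid(a + W @ mh - (W2 @ var_h) * (mv - 0.5))
        mv = damping * mv + (1.0 - damping) * mv_new
    return mv, mh

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_vis, n_hid = 20, 10
    W = rng.normal(scale=1.0 / np.sqrt(n_vis), size=(n_vis, n_hid))
    mv, mh = tap_magnetizations(W, np.zeros(n_vis), np.zeros(n_hid))
    print("visible magnetizations:", np.round(mv, 3))
```

In a deterministic training scheme of the kind the abstract describes, such fixed-point magnetizations would stand in for Monte Carlo samples when estimating the model-dependent term of the log-likelihood gradient.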