Entropy (Nov 2021)

Sampling the Variational Posterior with Local Refinement

  • Marton Havasi,
  • Jasper Snoek,
  • Dustin Tran,
  • Jonathan Gordon,
  • José Miguel Hernández-Lobato

DOI
https://doi.org/10.3390/e23111475
Journal volume & issue
Vol. 23, no. 11
p. 1475

Abstract

Variational inference is an optimization-based method for approximating the posterior distribution of the parameters in Bayesian probabilistic models. A key challenge of variational inference is to approximate the posterior with a distribution that is computationally tractable yet sufficiently expressive. We propose a novel method for generating samples from a highly flexible variational approximation. The method starts with a coarse initial approximation and generates samples by refining it in selected, local regions. This allows the samples to capture dependencies and multi-modality in the posterior, even when these are absent from the initial approximation. We demonstrate theoretically that our method always improves the quality of the approximation (as measured by the evidence lower bound). In experiments, our method consistently outperforms recent variational inference methods in terms of log-likelihood and ELBO across three example tasks: the Eight-Schools example (an inference task in a hierarchical model), training a ResNet-20 (Bayesian inference in a large neural network), and the Mushroom task (posterior sampling in a contextual bandit problem).
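To make the core idea concrete, here is a purely illustrative Python sketch, not the paper's actual algorithm (which refines the variational distribution itself and carries the ELBO-improvement guarantee mentioned above). It assumes a toy 2D correlated-Gaussian posterior and refines each draw from a coarse mean-field approximation by a few gradient-ascent steps on the log posterior with a quadratic locality penalty; the names grad_log_post, refine, and the penalty weight lam are all hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy target: zero-mean 2D Gaussian with strong correlation, which a
    # mean-field (fully factorized) approximation cannot represent.
    COV = np.array([[1.0, 0.9],
                    [0.9, 1.0]])
    PREC = np.linalg.inv(COV)

    def grad_log_post(theta):
        # Gradient of the unnormalized log posterior log p(theta).
        return -PREC @ theta

    def sample_coarse(n):
        # Coarse initial approximation: independent standard normals.
        return rng.normal(size=(n, 2))

    def refine(theta0, lam=1.0, lr=0.1, steps=50):
        # Locally refine one sample by gradient ascent on
        #     log p(theta) - (lam / 2) * ||theta - theta0||^2,
        # where the quadratic penalty keeps the refinement local to theta0.
        theta = theta0.copy()
        for _ in range(steps):
            theta += lr * (grad_log_post(theta) - lam * (theta - theta0))
        return theta

    coarse = sample_coarse(5000)
    refined = np.array([refine(t) for t in coarse])

    print("coarse sample correlation:  %+.2f" % np.corrcoef(coarse.T)[0, 1])
    print("refined sample correlation: %+.2f" % np.corrcoef(refined.T)[0, 1])

Under these assumptions, the coarse samples show roughly zero correlation while the refined samples recover a correlation near the target's 0.90, illustrating how local refinement can capture dependencies that the initial factorized approximation misses. The locality penalty stands in for the "selected, local regions" of the abstract: without it, every sample would collapse toward the posterior mode.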

Keywords