AIMS Mathematics (Mar 2019)

Relative entropy minimization over Hilbert spaces via Robbins-Monro

  • Gideon Simpson
  • Daniel Watkins

DOI
https://doi.org/10.3934/math.2019.3.359
Journal volume & issue
Vol. 4, no. 3
pp. 359–383

Abstract

One way of gaining insight into non-Gaussian measures is to first obtain good Gaussian approximations. These best-fit Gaussians can provide a sense of the mean and variance of the distribution of interest, and they can also be used to accelerate sampling algorithms. This raises the questions of how one should measure optimality and how one then obtains the optimal approximation. Here, we consider the problem of minimizing the distance between a family of Gaussians and the target measure with respect to relative entropy, or Kullback-Leibler divergence. As we are interested in applications in the infinite-dimensional setting, it is desirable to have convergent algorithms that are well posed on abstract Hilbert spaces. We examine this minimization problem by seeking roots of the first variation of relative entropy, taken with respect to the mean of the Gaussian, leaving the covariance fixed. We prove the convergence of Robbins-Monro-type root-finding algorithms in this context, highlighting the assumptions necessary for convergence to relative entropy minimizers. Numerical examples are included to illustrate the algorithms.
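To make the approach concrete, the following is a minimal finite-dimensional sketch of the Robbins-Monro iteration the abstract describes, not the paper's actual method. It assumes a target density proportional to exp(-V(x)) on R^d and a Gaussian family N(m, C) with the covariance C held fixed; the potential V, the step sizes a_n, and all parameter values are illustrative choices. Since the entropy of N(m, C) does not depend on m, the first variation of the relative entropy in m reduces, by reparametrization, to E[grad V(x)] with x ~ N(m, C), so a single sample gives an unbiased estimate of the quantity whose root is sought.

```python
# Hypothetical illustration: Robbins-Monro root finding for the mean of a
# Gaussian approximation, with the covariance fixed. V and all parameters
# are example choices, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def grad_V(x):
    # Gradient of an example quartic potential V(x) = sum(x^4/4 - x^2/2 + x).
    return x**3 - x + 1.0

d = 2                      # dimension of the toy problem
C_sqrt = 0.5 * np.eye(d)   # fixed square root of the covariance C
m = np.zeros(d)            # initial guess for the mean

for n in range(1, 20001):
    # Draw x ~ N(m, C); by reparametrization x = m + C^{1/2} xi, we have
    # grad_m E[V(x)] = E[grad V(x)], so grad_V(x) is an unbiased estimate
    # of the first variation of the relative entropy in m.
    x = m + C_sqrt @ rng.standard_normal(d)
    a_n = 1.0 / n          # steps with sum a_n = inf, sum a_n^2 < inf
    m = m - a_n * grad_V(x)

print("approximate KL-optimal mean:", m)
```

The step-size sequence a_n = 1/n satisfies the classical Robbins-Monro conditions (divergent sum, square-summable); the paper's contribution concerns when such iterations remain well posed and convergent on abstract Hilbert spaces rather than R^d.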

Keywords