Information (Aug 2024)

Minimum Mean Squared Error Estimation and Mutual Information Gain

  • Jerry Gibson

DOI
https://doi.org/10.3390/info15080497
Journal volume & issue
Vol. 15, no. 8
p. 497

Abstract


Information theoretic quantities such as entropy, entropy rate, information gain, and relative entropy are often used to understand the performance of intelligent agents in learning applications. Mean squared error has not played a role in these analyses, primarily because it is not considered a viable performance indicator in these scenarios. We build on a new quantity, the log ratio of entropy powers, to establish that minimum mean squared error (MMSE) estimation, prediction, and smoothing are directly connected to mutual information gain or loss in an agent learning system modeled by a Markov chain for many probability distributions of interest. Expressions for mutual information gain or loss are developed for MMSE estimation, prediction, and smoothing, and an example for fixed lag smoothing is presented.
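The connection the abstract describes can be illustrated in the jointly Gaussian case, where the result is classical: the entropy power of a Gaussian equals its variance, so half the log ratio of the source variance to the MMSE equals the mutual information I(X;Y). The sketch below is not taken from the paper; it is a minimal numerical check of that Gaussian special case, with all variable names (`var_x`, `rho`, `mmse`) chosen here for illustration.

```python
import math

def gaussian_mmse(var_x, rho):
    """MMSE of estimating X from Y for jointly Gaussian (X, Y)
    with correlation coefficient rho: var_x * (1 - rho^2)."""
    return var_x * (1.0 - rho**2)

def info_gain_from_entropy_powers(var_x, mmse):
    """Mutual information gain as half the log ratio of entropy powers.
    For Gaussians, entropy power = variance, so this is 0.5*ln(var_x/mmse)."""
    return 0.5 * math.log(var_x / mmse)

var_x, rho = 4.0, 0.8
mmse = gaussian_mmse(var_x, rho)                 # 4 * (1 - 0.64) = 1.44

# Half the log entropy-power ratio...
gain = info_gain_from_entropy_powers(var_x, mmse)
# ...matches the standard Gaussian mutual information -0.5*ln(1 - rho^2).
direct = -0.5 * math.log(1.0 - rho**2)

print(f"MMSE = {mmse:.4f}, info gain = {gain:.4f} nats, direct I(X;Y) = {direct:.4f} nats")
```

The agreement of the two quantities is exact here because the Gaussian attains its entropy power bound; the paper's contribution is extending this MMSE-to-mutual-information link to estimation, prediction, and smoothing for a broader family of distributions.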

Keywords