Entropy (Jul 2022)

On the Distribution of the Information Density of Gaussian Random Vectors: Explicit Formulas and Tight Approximations

  • Jonathan E. W. Huffmann,
  • Martin Mittelbach

DOI
https://doi.org/10.3390/e24070924
Journal volume & issue
Vol. 24, no. 7
p. 924

Abstract


Based on canonical correlation analysis, we derive series representations of the probability density function (PDF) and the cumulative distribution function (CDF) of the information density of arbitrary Gaussian random vectors, as well as a general formula for calculating the central moments. Using the general results, we give closed-form expressions of the PDF and CDF and explicit formulas for the central moments for important special cases. Furthermore, we derive recurrence formulas and tight approximations of the general series representations, which allow efficient numerical calculations with arbitrarily high accuracy, as demonstrated with a Python implementation publicly available on GitLab. Finally, we discuss the (in)validity of Gaussian approximations of the information density.
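
To make the quantity under study concrete, the following minimal sketch (not the authors' GitLab implementation) illustrates the scalar special case: for jointly Gaussian, zero-mean, unit-variance X and Y with correlation rho, the information density i(X;Y) has a closed form, its mean equals the mutual information -0.5*log(1-rho^2), and its variance works out to rho^2. The Monte Carlo comparison below also hints at the abstract's final point about the quality of Gaussian approximations; all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8
n = 200_000

# Sample (X, Y) jointly Gaussian, zero mean, unit variance, correlation rho.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

# Closed-form information density i(x, y) = log[ p_XY(x, y) / (p_X(x) p_Y(y)) ]
# for the bivariate standard Gaussian with correlation rho.
info_density = (-0.5 * np.log(1.0 - rho**2)
                - (rho**2 * x**2 + rho**2 * y**2 - 2.0 * rho * x * y)
                / (2.0 * (1.0 - rho**2)))

# The mean is the mutual information; the variance in this scalar case is rho**2.
mean_i = -0.5 * np.log(1.0 - rho**2)
var_i = rho**2
print(f"MC mean {info_density.mean():.4f}  vs  I(X;Y) = {mean_i:.4f}")
print(f"MC var  {info_density.var():.4f}  vs  rho^2  = {var_i:.4f}")

# A skewness far from zero indicates that a Gaussian approximation of the
# information density distribution may be inaccurate.
std_i = info_density.std()
skew = np.mean(((info_density - info_density.mean()) / std_i) ** 3)
print(f"MC skewness {skew:.4f}")
```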

Keywords