Electronic Journal of Differential Equations (Aug 2016)

Bounds for Kullback-Leibler divergence

  • Pantelimon G. Popescu,
  • Sever S. Dragomir,
  • Emil I. Slusanschi,
  • Octavian N. Stanasila

Journal volume & issue
Vol. 2016, no. 237,
pp. 1 – 6

Abstract

Entropy, conditional entropy and mutual information for discrete-valued random variables play important roles in information theory. The purpose of this paper is to present new bounds for the relative entropy $D(p||q)$ of two probability distributions, and then to apply them to the simple entropy and to mutual information. The upper bound obtained for the relative entropy is a refinement of a bound previously presented in the literature.
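For readers unfamiliar with the quantity being bounded: the relative entropy (Kullback-Leibler divergence) of two discrete distributions $p$ and $q$ is $D(p||q) = \sum_i p_i \log(p_i/q_i)$. The following sketch (not taken from the paper; distribution values are illustrative) computes it directly and shows its two basic properties, $D(p||p) = 0$ and $D(p||q) \ge 0$:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p||q) for discrete distributions, in nats.

    Assumes p and q are probability vectors over the same support and
    that q[i] > 0 wherever p[i] > 0 (absolute continuity); terms with
    p[i] == 0 contribute zero by the convention 0 * log 0 = 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative distributions (hypothetical, not from the paper)
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, p))  # 0.0: divergence of a distribution from itself
print(kl_divergence(p, q))  # positive, by Gibbs' inequality
```

Note that $D(p||q)$ is not symmetric in general: $D(p||q) \ne D(q||p)$, which is one reason two-sided bounds of the kind studied in the paper are of interest.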
