Entropy (Apr 2023)

An Axiomatic Characterization of Mutual Information

  • James Fullwood

DOI
https://doi.org/10.3390/e25040663
Journal volume & issue
Vol. 25, no. 4
p. 663

Abstract

We characterize mutual information as the unique map on ordered pairs of discrete random variables satisfying a set of axioms similar to those of Faddeev’s characterization of the Shannon entropy. Our characterization, however, contains a new axiom with no analog for Shannon entropy, based on the notion of a Markov triangle, which may be thought of as a composition of communication channels for which conditional entropy acts functorially. Our proofs are coordinate-free in the sense that no logarithms appear in our calculations.
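As background for the quantity being axiomatized, the following is a minimal sketch of mutual information for discrete random variables computed from a joint distribution via the standard logarithmic formula I(X;Y) = H(X) + H(Y) − H(X,Y). Note this uses logarithms directly, unlike the paper's coordinate-free proofs; the function names and the example distributions are illustrative, not from the paper.

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a probability vector, skipping zero entries."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with joint[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

# Two independent fair coins carry no information about each other:
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # → 0.0
# Two perfectly correlated fair coins share exactly one bit:
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # → 1.0
```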

Keywords