Entropy (Jul 2017)

On Extractable Shared Information

  • Johannes Rauh,
  • Pradeep Kr. Banerjee,
  • Eckehard Olbrich,
  • Jürgen Jost,
  • Nils Bertschinger

DOI
https://doi.org/10.3390/e19070328
Journal volume & issue
Vol. 19, no. 7
p. 328

Abstract


We consider the problem of quantifying the information shared by a pair of random variables X₁, X₂ about another variable S. We propose a new measure of shared information, called extractable shared information, that is left monotonic; that is, the information shared about S is bounded from below by the information shared about f(S) for any function f. We show that our measure leads to a new nonnegative decomposition of the mutual information I(S; X₁X₂) into shared, complementary, and unique components. We study properties of this decomposition and show that a left monotonic shared information measure is not compatible with a Blackwell interpretation of unique information. We also discuss whether it is possible to have a decomposition in which both shared and unique information are left monotonic.
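For orientation, the two properties described in the abstract can be sketched as follows; the notation (SI for shared, UI for unique, CI for complementary information) is assumed here and may differ from the symbols used in the paper itself:

```latex
% Left monotonicity (as stated in the abstract): the information shared
% about S is bounded from below by the information shared about f(S)
% for any function f of S.
SI(S; X_1, X_2) \;\geq\; SI(f(S); X_1, X_2)

% Nonnegative decomposition of the mutual information I(S; X_1 X_2)
% into shared, unique, and complementary components (standard
% partial-information-decomposition form, assumed):
I(S; X_1 X_2) \;=\; SI(S; X_1, X_2)
  \;+\; UI(S; X_1 \setminus X_2)
  \;+\; UI(S; X_2 \setminus X_1)
  \;+\; CI(S; X_1, X_2)
```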

Keywords