Entropy (Jan 2017)

Entropy, Shannon’s Measure of Information and Boltzmann’s H-Theorem

  • Arieh Ben-Naim

DOI
https://doi.org/10.3390/e19020048
Journal volume & issue
Vol. 19, no. 2
p. 48

Abstract

We start with a clear distinction between Shannon’s Measure of Information (SMI) and the thermodynamic entropy. The first is defined on any probability distribution and is therefore a very general concept, whereas entropy is defined only on a very special set of distributions. Next we show that the SMI provides a solid and quantitative basis for the interpretation of the thermodynamic entropy: the entropy measures the uncertainty in the distribution of the locations and momenta of all the particles, together with two corrections due to the uncertainty principle and the indistinguishability of the particles. Finally, we show that the H-function as defined by Boltzmann is an SMI but not an entropy; consequently, much of what has been written on the H-theorem is irrelevant to entropy and the Second Law of Thermodynamics.
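
For orientation, the SMI of a discrete probability distribution takes the standard Shannon form; the formula below is textbook material, sketched here as a reference rather than quoted from the paper:

\[
% SMI of a discrete distribution p = (p_1, ..., p_n);
% with the base-2 logarithm the value is in bits.
H(p_1, \dots, p_n) = -\sum_{i=1}^{n} p_i \log_2 p_i
\]

On the paper’s reading, the thermodynamic entropy of an equilibrium state is, up to a multiplicative constant, the SMI of the continuous distribution of particle locations and momenta, adjusted by the two quantum corrections mentioned in the abstract.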

Keywords