Entropy (Jun 2025)

Improving the Minimum Free Energy Principle to the Maximum Information Efficiency Principle

  • Chenguang Lu

DOI
https://doi.org/10.3390/e27070684
Journal volume & issue
Vol. 27, no. 7
p. 684

Abstract


Friston proposed the Minimum Free Energy Principle (FEP) based on the Variational Bayesian (VB) method. This principle emphasizes that the brain and behavior coordinate with the environment, promoting self-organization. However, it has a theoretical flaw, a potential for misunderstanding, and a limitation (only likelihood functions are used as constraints). This paper first introduces the semantic information G theory and the R(G) function (where R is the minimum mutual information for the given semantic mutual information G). The G theory is based on the P-T probability framework and therefore allows the use of truth, membership, similarity, and distortion functions (related to semantics) as constraints. Based on the study of the R(G) function and logical Bayesian Inference, this paper proposes the Semantic Variational Bayesian (SVB) method and the Maximum Information Efficiency (MIE) principle. Theoretical analysis and computational experiments prove that it is R − G = F − H(X|Y) (where F denotes VFE, and H(X|Y) is Shannon conditional entropy), rather than F itself, that continues to decrease when optimizing latent variables; SVB is a reliable and straightforward approach for latent variables and active inference. This paper also explains the relationship between information, entropy, free energy, and VFE in local non-equilibrium and equilibrium systems, concluding that Shannon information, semantic information, and VFE are analogous to the increment of free energy, the increment of exergy, and physical conditional entropy, respectively. The MIE principle builds upon the fundamental ideas of the FEP, making them easier to understand and apply. For wider applications, it needs to be combined with deep learning methods.
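The identity R − G = F − H(X|Y) relates the paper's quantities to standard Shannon measures. As a minimal sketch of those standard measures only (not of the paper's R(G) function or semantic information G), the following Python computes the conditional entropy H(X|Y) and the mutual information I(X;Y) from an assumed toy joint distribution p(x, y), illustrating the identity I(X;Y) = H(X) − H(X|Y):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability cells contribute nothing."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Toy joint distribution p(x, y): rows index X, columns index Y.
# (An illustrative example, not data from the paper.)
p_xy = np.array([[0.3, 0.1],
                 [0.1, 0.5]])

p_x = p_xy.sum(axis=1)           # marginal p(x)
p_y = p_xy.sum(axis=0)           # marginal p(y)

H_X = entropy(p_x)
H_Y = entropy(p_y)
H_XY = entropy(p_xy.ravel())     # joint entropy H(X, Y)
H_X_given_Y = H_XY - H_Y         # conditional entropy H(X|Y) via chain rule
I_XY = H_X - H_X_given_Y         # mutual information I(X;Y), always >= 0
```

Here the mutual information I(X;Y) plays the role of the Shannon mutual information whose minimum (under the semantic constraint G) defines R in the R(G) function.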

Keywords