Mathematics (Nov 2020)
Inequalities for Information Potentials and Entropies
Abstract
We consider a probability distribution $p_0(x), p_1(x), \dots$ depending on a real parameter $x$. The associated information potential is $S(x) := \sum_k p_k^2(x)$. The Rényi entropy and the Tsallis entropy of order 2 can be expressed as $R(x) = -\log S(x)$ and $T(x) = 1 - S(x)$, respectively. We establish recurrence relations, inequalities and bounds for $S(x)$, which lead immediately to similar relations, inequalities and bounds for the two entropies. We show that certain sequences $(R_n(x))_{n \geq 0}$ and $(T_n(x))_{n \geq 0}$, associated with sequences of classical positive linear operators, are concave and increasing. Two conjectures are formulated, involving the information potentials associated with the Durrmeyer probability density and with the Bleimann–Butzer–Hahn probability distribution, respectively.
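The quantities defined in the abstract can be illustrated with a minimal sketch. As an example of a parametric family $p_k(x)$ we take the binomial (Bernstein) basis $p_k(x) = \binom{n}{k} x^k (1-x)^{n-k}$; this choice is an illustrative assumption, not taken from the abstract itself:

```python
import math

def binomial_pmf(n, x):
    # Binomial (Bernstein) basis: p_k(x) = C(n,k) x^k (1-x)^(n-k), k = 0..n.
    # An illustrative example of a distribution depending on a parameter x.
    return [math.comb(n, k) * x**k * (1 - x)**(n - k) for k in range(n + 1)]

def information_potential(p):
    # S = sum_k p_k^2
    return sum(q * q for q in p)

def renyi_entropy(p):
    # Renyi entropy of order 2: R = -log S
    return -math.log(information_potential(p))

def tsallis_entropy(p):
    # Tsallis entropy of order 2: T = 1 - S
    return 1 - information_potential(p)

# For n = 1, x = 1/2 the distribution is (1/2, 1/2), so S = 1/2,
# T = 1/2 and R = log 2.
p = binomial_pmf(1, 0.5)
print(information_potential(p), tsallis_entropy(p), renyi_entropy(p))
```

Bounds on $S(x)$ transfer directly to the entropies: since $T = 1 - S$ and $R = -\log S$ are both decreasing in $S$, any upper bound for $S(x)$ yields lower bounds for $T(x)$ and $R(x)$, and vice versa.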
Keywords