Mathematics (Oct 2024)

Asymptotic Properties of a Statistical Estimator of the Jeffreys Divergence: The Case of Discrete Distributions

  • Vladimir Glinskiy,
  • Artem Logachov,
  • Olga Logachova,
  • Helder Rojas,
  • Lyudmila Serga,
  • Anatoly Yambartsev

DOI
https://doi.org/10.3390/math12213319
Journal volume & issue
Vol. 12, no. 21
p. 3319

Abstract

We investigate the asymptotic properties of the plug-in estimator for the Jeffreys divergence, the symmetric variant of the Kullback–Leibler (KL) divergence. This study focuses specifically on the divergence between discrete distributions. Traditionally, estimators rely on two independent samples corresponding to two distinct conditions. However, we propose a one-sample estimator where the condition results from a random event. We establish the estimator’s asymptotic unbiasedness (law of large numbers) and asymptotic normality (central limit theorem). Although the results are expected, the proofs require additional technical work due to the randomness of the conditions.
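To make the objects concrete, here is a minimal sketch of the plug-in estimator described above: the Jeffreys divergence J(P, Q) = KL(P‖Q) + KL(Q‖P) = Σᵢ (pᵢ − qᵢ) log(pᵢ/qᵢ) is estimated by substituting empirical frequencies for the unknown discrete distributions. The function names (`jeffreys_plug_in`, `jeffreys_one_sample`) and the handling of empty cells are illustrative assumptions, not the paper's exact construction; in particular, the one-sample variant below simply splits a single sample by a random Boolean condition, as a stand-in for the paper's setting where the condition results from a random event.

```python
import numpy as np

def jeffreys_plug_in(x, y, support):
    """Plug-in estimate of the Jeffreys divergence J(P, Q) between two
    discrete distributions, from samples x ~ P and y ~ Q on `support`.

    J(P, Q) = KL(P||Q) + KL(Q||P) = sum_i (p_i - q_i) * log(p_i / q_i),
    with the unknown probabilities replaced by empirical frequencies.
    """
    x, y = np.asarray(x), np.asarray(y)
    p = np.array([np.mean(x == s) for s in support])  # empirical P
    q = np.array([np.mean(y == s) for s in support])  # empirical Q
    # Illustrative choice: skip cells empty in either sample to avoid log(0).
    mask = (p > 0) & (q > 0)
    return float(np.sum((p[mask] - q[mask]) * np.log(p[mask] / q[mask])))

def jeffreys_one_sample(values, conditions, support):
    """One-sample variant (hypothetical sketch): a single sample of
    (value, condition) pairs is split by the random Boolean condition,
    and the two conditional subsamples play the roles of x and y."""
    values = np.asarray(values)
    conditions = np.asarray(conditions, dtype=bool)
    return jeffreys_plug_in(values[conditions], values[~conditions], support)
```

Note that the estimand is symmetric in P and Q, so the estimate is invariant under swapping the two samples; for identical empirical distributions it is exactly zero.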

Keywords