Stats (Feb 2024)

On Estimation of Shannon’s Entropy of Maxwell Distribution Based on Progressively First-Failure Censored Data

  • Kapil Kumar,
  • Indrajeet Kumar,
  • Hon Keung Tony Ng

DOI
https://doi.org/10.3390/stats7010009
Journal volume & issue
Vol. 7, no. 1
pp. 138–159

Abstract

Shannon’s entropy is a fundamental concept in information theory that quantifies the uncertainty or information in a random variable or data set. This article addresses the estimation of Shannon’s entropy for the Maxwell lifetime model based on progressively first-failure-censored data from both classical and Bayesian points of view. From the classical perspective, the entropy is estimated using maximum likelihood estimation and bootstrap methods. For Bayesian estimation, two approximation techniques, the Tierney-Kadane (T-K) approximation and the Markov chain Monte Carlo (MCMC) method, are used to compute the Bayes estimate of Shannon’s entropy under the linear exponential (LINEX) loss function. We also obtain the highest posterior density (HPD) credible interval of Shannon’s entropy using the MCMC technique. A Monte Carlo simulation study is performed to investigate the performance of the estimation procedures and methodologies studied in this manuscript. A numerical example is used to illustrate the methodologies. This paper aims to provide practical value in applied statistics, especially in the areas of reliability and lifetime data analysis.
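
As a rough sketch of the plug-in idea behind the classical (maximum likelihood) estimate, the example below assumes the one-parameter scale form of the Maxwell distribution used by scipy.stats.maxwell, whose Shannon (differential) entropy has the closed form H(a) = ln(a * sqrt(2*pi)) + gamma - 1/2, with gamma the Euler-Mascheroni constant. It uses a complete, uncensored sample and a hypothetical scale value; the paper's progressively first-failure-censored likelihood, bootstrap intervals, and Bayesian T-K/MCMC estimators under the LINEX loss are not reproduced here.

# Minimal plug-in illustration for a complete (uncensored) Maxwell sample:
# fit the scale parameter by maximum likelihood, then evaluate the entropy.
import numpy as np
from scipy.stats import maxwell

rng = np.random.default_rng(2024)
true_scale = 1.5                                   # hypothetical scale parameter
data = maxwell.rvs(scale=true_scale, size=100, random_state=rng)

# ML fit with the location fixed at zero (usual lifetime-model convention).
loc_hat, scale_hat = maxwell.fit(data, floc=0)

# Closed-form entropy: H(a) = ln(a * sqrt(2*pi)) + gamma - 1/2.
entropy_closed_form = np.log(scale_hat * np.sqrt(2 * np.pi)) + np.euler_gamma - 0.5

# scipy's maxwell.entropy() returns the same quantity (up to numerical precision).
entropy_scipy = maxwell.entropy(scale=scale_hat)

print(f"scale MLE        : {scale_hat:.4f}")
print(f"entropy (formula): {entropy_closed_form:.4f}")
print(f"entropy (scipy)  : {entropy_scipy:.4f}")

Substituting the maximum likelihood estimate of the scale into the closed-form expression is the plug-in (invariance) step; the censored-data versions studied in the paper replace this complete-sample likelihood with the progressive first-failure-censored one.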

Keywords