Entropy (Dec 2021)

On Quantization Errors in Approximate and Sample Entropy

  • Dragana Bajić,
  • Nina Japundžić-Žigon

DOI
https://doi.org/10.3390/e24010073
Journal volume & issue
Vol. 24, no. 1, p. 73

Abstract

Approximate and sample entropies are acclaimed tools for quantifying the regularity and unpredictability of time series. This paper analyses the causes of their inconsistencies. It is shown that the major problem is the coarse quantization of matching probabilities, which causes a large error between their estimated and true values. The error distribution is symmetric, so in sample entropy, where matching probabilities are summed directly, the errors cancel each other. In approximate entropy, the errors accumulate, as the sums involve logarithms of the matching probabilities. Increasing the time series length increases the number of quantization levels, and the entropy errors vanish in both approximate and sample entropy. The distribution of the time series also affects the errors: if it is asymmetric, the matching probabilities are asymmetric as well, so the matching-probability errors no longer cancel each other and cause a persistent entropy error. Contrary to accepted opinion, the influence of self-matching is marginal, as it merely shifts the error distribution along the error axis by one matching-probability quantization step. Artificially lengthening the time series by interpolation, on the other hand, induces a large error, as the interpolated samples are statistically dependent and destroy the level of unpredictability inherent in the original signal.

Keywords