Axioms (Feb 2024)

Entropy of Difference: A New Tool for Measuring Complexity

  • Pasquale Nardone,
  • Giorgio Sonnino

DOI
https://doi.org/10.3390/axioms13020130
Journal volume & issue
Vol. 13, no. 2
p. 130

Abstract


We propose a new tool for estimating the complexity of a time series: the entropy of difference (ED). The method is based solely on the sign of the difference between neighboring values in a time series. This makes it possible to describe the signal as efficiently as previously proposed parameters, such as permutation entropy (PE) or modified permutation entropy (mPE). First, this method reduces the sample size needed to estimate the parameter value; second, it enables the use of the Kullback–Leibler divergence to estimate the “distance” between the time series data and random signals.
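The idea described in the abstract can be sketched as follows: encode the series as a binary sequence of difference signs, count words of m consecutive signs, and compute the Shannon entropy of that word distribution; the Kullback–Leibler divergence can then be taken against the sign-word distribution of a reference random signal. This is a minimal illustrative sketch, not the authors' implementation; the word length m, the tie-handling convention (non-positive differences coded as 0), and the white-noise reference are assumptions.

```python
import numpy as np

def sign_words(x, m=4):
    """Encode a series as words of m consecutive difference signs (1 = up, 0 = down/tie)."""
    s = (np.diff(np.asarray(x, dtype=float)) > 0).astype(int)
    n = len(s) - m + 1
    # pack each window of m signs into an integer code in [0, 2**m)
    codes = np.zeros(n, dtype=int)
    for j in range(m):
        codes += s[j:j + n] << (m - 1 - j)
    return codes

def entropy_of_difference(x, m=4):
    """Shannon entropy (bits) of the distribution of sign words."""
    codes = sign_words(x, m)
    p = np.bincount(codes, minlength=2 ** m) / len(codes)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def kl_to_noise(x, m=4, n_ref=100_000, seed=0):
    """KL divergence (bits) between the sign-word distribution of x and that
    of a simulated white-noise reference (an assumption for illustration)."""
    rng = np.random.default_rng(seed)
    p = np.bincount(sign_words(x, m), minlength=2 ** m).astype(float)
    q = np.bincount(sign_words(rng.standard_normal(n_ref), m), minlength=2 ** m).astype(float)
    p /= p.sum()
    q /= q.sum()
    mask = p > 0  # with n_ref large, every word occurs in the reference
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))
```

A monotone series yields a single sign word and hence zero entropy, while an irregular signal spreads probability over many words; note that even for white noise the sign-word distribution is not uniform, which is why the KL divergence is taken against an empirical noise reference rather than a uniform distribution.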
