Entropy (Apr 2018)

Simulation Study on the Application of the Generalized Entropy Concept in Artificial Neural Networks

  • Krzysztof Gajowniczek,
  • Arkadiusz Orłowski,
  • Tomasz Ząbkowski

DOI
https://doi.org/10.3390/e20040249
Journal volume & issue
Vol. 20, no. 4
p. 249

Abstract


Artificial neural networks are currently among the most commonly used classifiers, and over recent years they have been successfully applied in many practical domains, including banking and finance, health and medicine, and engineering and manufacturing. A large number of error functions have been proposed in the literature to achieve better predictive power. However, only a few works employ Tsallis statistics, although the method itself has been successfully applied in other machine learning techniques. This paper examines the q-generalized function based on Tsallis statistics as an alternative error measure in neural networks. In order to validate different performance aspects of the proposed function and to identify its strengths and weaknesses, an extensive simulation was prepared based on an artificial benchmarking dataset. The results indicate that the Tsallis entropy error function can be successfully introduced in neural networks, yielding satisfactory results while handling class imbalance, noise in the data, and the presence of non-informative predictors.
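To illustrate the idea of a q-generalized error measure, the following is a minimal sketch (not the paper's exact formulation) of a Tsallis-style cross-entropy built from the q-logarithm, which reduces to the ordinary logarithmic cross-entropy as q approaches 1. The function names and the choice of q are illustrative assumptions.

```python
import numpy as np

def q_log(x, q):
    # Tsallis q-logarithm: (x^(1-q) - 1) / (1 - q);
    # reduces to the natural logarithm in the limit q -> 1.
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_cross_entropy(y_true, y_pred, q=1.5, eps=1e-12):
    # q-generalized binary cross-entropy, averaged over samples.
    # Predictions are clipped to avoid evaluating the q-log at 0 or 1.
    p = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * q_log(p, q) + (1.0 - y_true) * q_log(1.0 - p, q))
```

For q = 1 this coincides with the standard cross-entropy loss; other values of q reweight how strongly confident mistakes are penalized, which is one motivation for using it as an alternative error function in neural network training.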

Keywords