Informatică economică (Jan 2019)

The Kullback-Leibler Divergence Class in Decoding the Chest Sound Pattern

  • Antonio CLIM,
  • Razvan Daniel ZOTA

DOI
https://doi.org/10.12948/issn14531305/23.1.2019.05
Journal volume & issue
Vol. 23, no. 1
pp. 50 – 60

Abstract

The Kullback-Leibler divergence, or relative entropy, is a special case of a broader class of divergences. It measures how one probability distribution diverges from a second, expected probability distribution. The Kullback-Leibler divergence has many real-time applications. Despite good progress in the field of medicine, statistical analysis is still needed to support emerging requirements. In this paper, we discuss the application of the Kullback-Leibler divergence as a possible method for predicting hypertension from chest sound recordings combined with machine learning algorithms; such a method could bring far-reaching benefits to emergency health care systems. Decoding the chest sound pattern has wide scope in the medical field for distinguishing various irregularities and health states of a person. The proposed method for estimating blood pressure is chest sound analysis: a recording is made of the sounds produced by the contracting heart, resulting from the vibration of the valves and associated vessels, and this recording is analyzed with the help of the Kullback-Leibler divergence and a machine learning algorithm. An analysis based on the Kullback-Leibler divergence allows differences between chest sound recordings to be found, which can then be evaluated by a machine learning algorithm. The paper also proposes a method for analyzing chest sound recordings within the Kullback-Leibler divergence class.
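As a minimal illustration of the measure the abstract describes, the discrete Kullback-Leibler divergence D(P || Q) can be computed directly from two probability distributions over the same support. The feature vectors below are hypothetical placeholders, not data from the paper:

```python
import math

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(P || Q) in nats.

    Assumes p and q are probability distributions over the same
    support, with q[i] > 0 wherever p[i] > 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example: two normalized histograms standing in for
# feature distributions extracted from two chest sound recordings.
p = [0.1, 0.4, 0.5]
q = [0.2, 0.3, 0.5]
d = kl_divergence(p, q)  # nonnegative; 0 only when p == q
```

The divergence is asymmetric (D(P || Q) generally differs from D(Q || P)), which is why the direction of comparison between a recorded distribution and an expected one matters in any such analysis.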

Keywords