Results in Applied Mathematics (Oct 2019)

Some bounds for skewed α-Jensen-Shannon divergence

  • Takuya Yamano

Journal volume & issue
Vol. 3

Abstract


Based on the skewed Kullback-Leibler divergence introduced in natural language processing, we derive upper and lower bounds on the skewed version of the Jensen-Shannon divergence and investigate their properties. In the process, we generalize the Bretagnolle-Huber inequality, which offers an upper bound on the skewed Kullback-Leibler divergence. We further show how the skewed Jensen-Shannon divergence is bounded from below in terms of the accuracy mismatch.
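To make the quantities in the abstract concrete, the following is a minimal sketch of one common convention for the skewed divergences (the α-skew divergence of Lee, where the second argument of the KL divergence is the mixture (1 − α)p + αq, and the skewed Jensen-Shannon divergence is the average of the two skewed KL terms). The exact definitions and the placement of α used in the paper may differ; this is an illustration, not the paper's construction.

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence KL(p || q) for discrete distributions,
    # with the convention that terms where p_i == 0 contribute zero.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def skewed_kl(p, q, alpha):
    # Skewed KL divergence KL(p || (1 - alpha)*p + alpha*q).
    # One common convention (Lee's alpha-skew divergence); the paper's
    # exact definition may place alpha differently.
    mix = [(1 - alpha) * pi + alpha * qi for pi, qi in zip(p, q)]
    return kl(p, mix)

def skewed_js(p, q, alpha):
    # A symmetrized skewed Jensen-Shannon divergence: the average of the
    # two skewed KL terms. For alpha = 1/2 this reduces to the ordinary
    # Jensen-Shannon divergence (each term is KL against the midpoint).
    return 0.5 * (skewed_kl(p, q, alpha) + skewed_kl(q, p, alpha))

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
print(skewed_js(p, q, 0.5))
```

For α = 1/2 the mixture becomes the midpoint (p + q)/2, recovering the classical Jensen-Shannon divergence, which is symmetric in p and q and vanishes exactly when p = q.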