Axioms (Dec 2022)

Representative Points Based on Power Exponential Kernel Discrepancy

  • Zikang Xiong,
  • Yao Xiao,
  • Jianhui Ning,
  • Hong Qin

DOI: https://doi.org/10.3390/axioms11120711
Journal volume & issue: Vol. 11, no. 12, p. 711

Abstract

Representative points (rep-points) are a set of points chosen to optimally represent a large original data set or a target distribution with respect to a statistical criterion, such as mean square error or discrepancy. Most existing criteria can only ensure representativeness over the whole variable space. In this paper, a new kernel discrepancy, named the power exponential kernel discrepancy (PEKD), is proposed to measure how well a point set represents a general multivariate distribution. Unlike commonly used criteria, PEKD can improve the projection properties of the point set, which is important in high-dimensional settings. Some theoretical results are presented to aid understanding of the new discrepancy and to guide the setting of its hyperparameters. An efficient algorithm for searching for rep-points under the PEKD criterion is presented, and its convergence is proven. Examples illustrate potential applications in numerical integration, uncertainty propagation, and the reduction of Markov chain Monte Carlo chains.
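To make the criterion concrete, the following is a minimal sketch of a kernel discrepancy in its standard squared-MMD form, using a power exponential kernel k(x, y) = exp(-θ Σ|x_k − y_k|^p). This is an illustration of the general kernel-discrepancy idea only, not the paper's exact PEKD definition or algorithm; the kernel form, the hyperparameters `theta` and `p`, and the function names are assumptions for the example.

```python
import math
import random

def pe_kernel(x, y, theta=1.0, p=1.5):
    # Assumed power exponential kernel: exp(-theta * sum_k |x_k - y_k|^p).
    # For 0 < p <= 2 this family is positive definite (p = 2 gives the
    # Gaussian kernel, p = 1 the Laplace kernel).
    return math.exp(-theta * sum(abs(a - b) ** p for a, b in zip(x, y)))

def kernel_discrepancy_sq(points, target_sample, theta=1.0, p=1.5):
    """Monte Carlo estimate of the squared kernel discrepancy between a
    candidate rep-point set and a target distribution represented by a
    large sample:  E[k(X,X')] - (2/n) * sum_i E[k(x_i, X)]
                   + (1/n^2) * sum_{i,j} k(x_i, x_j)."""
    n, m = len(points), len(target_sample)
    # Term 1: expectation of the kernel under the target (sample average).
    t1 = sum(pe_kernel(u, v, theta, p)
             for u in target_sample for v in target_sample) / (m * m)
    # Term 2: cross term between rep-points and the target sample.
    t2 = sum(pe_kernel(x, u, theta, p)
             for x in points for u in target_sample) / (n * m)
    # Term 3: pairwise term within the rep-point set.
    t3 = sum(pe_kernel(x, y, theta, p)
             for x in points for y in points) / (n * n)
    return t1 - 2.0 * t2 + t3

if __name__ == "__main__":
    random.seed(0)
    # Target: 2D standard normal, represented by a reference sample.
    target = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(200)]
    # A candidate set near the probability mass vs. one far from it.
    good = [(0, 0), (1, 1), (-1, -1), (1, -1), (-1, 1)]
    bad = [(5, 5), (6, 6), (7, 7), (5, 7), (7, 5)]
    print(kernel_discrepancy_sq(good, target))  # small: set matches the target
    print(kernel_discrepancy_sq(bad, target))   # larger: set misrepresents it
```

A rep-point search under such a criterion would then minimize `kernel_discrepancy_sq` over the point set, e.g. by coordinate-wise or gradient-based updates; the paper's algorithm and its convergence guarantee are specific to PEKD.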

Keywords