IEEE Access (Jan 2020)

SPD Data Dictionary Learning Based on Kernel Learning and Riemannian Metric

  • Rixin Zhuang,
  • Zhengming Ma,
  • Weijia Feng,
  • Yuanping Lin

DOI
https://doi.org/10.1109/ACCESS.2020.2984941
Journal volume & issue
Vol. 8
pp. 61956–61972

Abstract

The use of region covariance descriptors to generate feature data represented by Symmetric Positive Definite (SPD) matrices from images or videos has become increasingly common in machine learning. However, SPD matrices do not form a vector space, and dictionary learning involves a large number of linear operations, so dictionary learning cannot be performed directly on SPD data. A common remedy is to map the SPD data into a Reproducing Kernel Hilbert Space (RKHS). Kernel learning is the task of finding the RKHS best suited to a specific problem; since an RKHS is uniquely generated by its kernel function, RKHS learning can equivalently be regarded as kernel learning. This article makes two main contributions. The first is a framework based on Kernel Learning and the Riemannian Metric (KLRM); typically, a learnable-kernel framework learns certain parameters of the kernel function. The second is dictionary learning obtained by applying KLRM to SPD data: the SPD data are transformed into the RKHS generated by KLRM, and the trained RKHS provides the most suitable working space for dictionary learning. Under the proposed framework, we design a positive definite kernel function defined by the Log-Euclidean metric, which can be transformed into a corresponding Riemannian kernel. The experimental results reported in this paper, compared with those of other state-of-the-art algorithms for SPD data dictionary learning, show that the proposed algorithm achieves better results.
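To make the abstract's setting concrete, the sketch below shows a standard positive definite kernel on SPD matrices built from the Log-Euclidean metric (the Log-Euclidean Gaussian kernel). This is an illustrative assumption, not the paper's learnable KLRM kernel: the function names and the bandwidth `sigma` are hypothetical, and the paper's actual construction may differ.

```python
# Minimal sketch (not the paper's KLRM kernel): a Log-Euclidean Gaussian kernel
# k(X, Y) = exp(-||logm(X) - logm(Y)||_F^2 / (2 * sigma^2)),
# i.e. a positive definite kernel on SPD matrices induced by the Log-Euclidean metric.
import numpy as np
from scipy.linalg import logm


def log_euclidean_distance(X, Y):
    """Log-Euclidean distance between two SPD matrices."""
    return np.linalg.norm(logm(X) - logm(Y), ord="fro")


def log_euclidean_gaussian_kernel(X, Y, sigma=1.0):
    """Positive definite kernel on SPD matrices built from the Log-Euclidean metric."""
    d = log_euclidean_distance(X, Y)
    return np.exp(-d ** 2 / (2.0 * sigma ** 2))


def kernel_matrix(spd_mats, sigma=1.0):
    """Gram matrix for a list of SPD matrices, e.g. region covariance descriptors."""
    n = len(spd_mats)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(i, n):
            K[i, j] = K[j, i] = log_euclidean_gaussian_kernel(
                spd_mats[i], spd_mats[j], sigma
            )
    return K


if __name__ == "__main__":
    # Two random SPD matrices (A @ A.T + eps * I is SPD).
    rng = np.random.default_rng(0)
    A, B = rng.standard_normal((2, 5, 5))
    X = A @ A.T + 1e-3 * np.eye(5)
    Y = B @ B.T + 1e-3 * np.eye(5)
    print(kernel_matrix([X, Y], sigma=2.0))
```

A Gram matrix like this is what a kernelized dictionary-learning method would operate on once the SPD data have been mapped into the RKHS; in the paper, the kernel itself is additionally learned rather than fixed as above.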

Keywords