IEEE Access (Jan 2018)

Reversible Discriminant Analysis

  • Lan Bai,
  • Zhen Wang,
  • Yuan-Hai Shao,
  • Chun-Na Li

DOI
https://doi.org/10.1109/ACCESS.2018.2881256
Journal volume & issue
Vol. 6
pp. 72551 – 72562

Abstract

Principal component analysis (PCA) and linear discriminant analysis (LDA) are classical dimensionality reduction methods for unsupervised and supervised learning, respectively. However, compared with PCA, LDA loses several advantages because of the singularity of its between-class scatter matrix, which yields a singular mapping and restricts the reduced dimension. In this paper, we propose a dimensionality reduction method, called reversible discriminant analysis (RDA), that defines a full-rank between-class scatter matrix. Based on this newly defined between-class scatter matrix, RDA obtains a nonsingular mapping. Thus, RDA can reduce the sample space to an arbitrary dimension, and the mapped samples can be recovered. RDA is also extended to kernel-based dimensionality reduction. In addition, PCA and LDA are special cases of RDA. Experiments on benchmark and real-world problems confirm the effectiveness of the proposed method.
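
The abstract states the key property of RDA, a full-rank between-class scatter producing a nonsingular mapping, but not its exact construction. The Python sketch below illustrates how such a scheme could look, assuming the full-rank scatter is formed by augmenting the usual between-class scatter with a scaled total scatter; the function name rda_fit, the mixing weight mu, and the regularizer are illustrative assumptions, not the paper's definitions.

import numpy as np
from scipy.linalg import eigh

def rda_fit(X, y, mu=0.1):
    """Sketch of an RDA-style fit. X is (n_samples, n_features), y labels.

    Builds a full-rank surrogate for the between-class scatter by adding
    a scaled total scatter (an assumption; the paper's exact definition
    may differ), then solves the LDA-style generalized eigenproblem.
    The resulting mapping W is square and nonsingular.
    """
    n, d = X.shape
    mean = X.mean(axis=0)
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # ordinary between-class scatter (rank <= c-1)
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
    St = (X - mean).T @ (X - mean)   # total scatter, St = Sw + Sb
    Sb_full = Sb + mu * St           # assumed full-rank between-class scatter
    Sw_reg = Sw + 1e-8 * np.eye(d)   # small ridge keeps Sw positive definite
    # Generalized symmetric eigenproblem Sb_full v = lambda Sw v;
    # eigh returns eigenvalues in ascending order, so reverse the columns.
    evals, evecs = eigh(Sb_full, Sw_reg)
    W = evecs[:, ::-1]               # full d x d, nonsingular mapping
    return W

# Usage: project to k dimensions, then recover via the inverse mapping.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
y = np.repeat([0, 1, 2], 20)
W = rda_fit(X, y)
k = 2
Z = X @ W[:, :k]                     # reduced to an arbitrary dimension k
X_rec = (X @ W) @ np.linalg.inv(W)   # full mapping is reversible
print(np.allclose(X_rec, X))         # True

Because W is square and nonsingular, taking its first k columns reduces the data to any dimension k, while the full mapping can be inverted to recover the original samples, which is the reversibility the abstract describes.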

Keywords