Applied System Innovation (Jul 2022)

Cross-Validation for Lower Rank Matrices Containing Outliers

  • Sergio Arciniegas-Alarcón,
  • Marisol García-Peña,
  • Wojtek J. Krzanowski

DOI: https://doi.org/10.3390/asi5040069
Journal volume & issue: Vol. 5, no. 4, p. 69

Abstract

Several statistical techniques for analyzing data matrices use lower rank approximations to these matrices, for which, in general, the appropriate rank must first be estimated depending on the objective of the study. The estimation can be conducted by cross-validation (CV), but most methods are not designed to cope with the presence of outliers, a very common problem in data matrices. The literature suggests one option to circumvent the problem, namely, the elimination of the outliers, but such information removal should only be performed when it can be verified that an outlier effectively corresponds to a data-collection or typing error. This paper proposes a methodology that combines the robust singular value decomposition (rSVD) with a CV scheme, which allows outliers to be taken into account without eliminating them. For this, three possible rSVDs are considered and six resistant criteria are proposed for the choice of the rank, based on three classical multivariate statistics. To test the performance of the various methods, a simulation study and an analysis of real data are described, using an exclusively numerical evaluation through Procrustes statistics and critical angles between subspaces of principal components. We conclude that, when data matrices are contaminated with outliers, the best estimation of rank is the one that uses a CV scheme over a robust lower rank approximation (RLRA) containing as many components as possible. In our experiments, the best results were obtained when this RLRA was calculated using an rSVD that minimizes the L2 norm.
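To illustrate the general idea of choosing a rank by cross-validation over low-rank approximations, the following is a minimal sketch of a Wold-style CV scheme: random entries are held out, the remaining data are fitted with a rank-r approximation via EM-style imputation, and the rank minimizing the prediction error (PRESS) is selected. Note that this sketch uses a classical SVD rather than the robust rSVD of the paper, and the function names (`low_rank_approx`, `cv_rank`) and fold scheme are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def low_rank_approx(X, r):
    """Best rank-r approximation of X in the L2 (Frobenius) sense,
    obtained by truncating the classical SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def cv_rank(X, max_rank, n_folds=5, n_iter=50, seed=0):
    """Wold-style cross-validation for rank selection (illustrative sketch).

    Random entries are held out; the held-out values are imputed from a
    rank-r fit of the observed entries (EM-style iteration), and the rank
    with the smallest prediction error (PRESS) is returned.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    folds = np.array_split(rng.permutation(n * p), n_folds)
    press = np.zeros(max_rank)
    for fold in folds:
        mask = np.ones(n * p, dtype=bool)
        mask[fold] = False                 # False = held-out entry
        mask = mask.reshape(n, p)
        # Initialize held-out cells with column means of observed data
        col_means = np.nanmean(np.where(mask, X, np.nan), axis=0)
        for r in range(1, max_rank + 1):
            Xhat = np.where(mask, X, col_means)
            for _ in range(n_iter):        # EM: refit, re-impute
                Xhat = np.where(mask, X, low_rank_approx(Xhat, r))
            press[r - 1] += np.sum((X[~mask] - Xhat[~mask]) ** 2)
    return int(np.argmin(press)) + 1
```

A robust variant in the spirit of the paper would replace `low_rank_approx` with a robust SVD, so that outlying cells do not dominate either the fit or the imputations.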

Keywords