IEEE Access (Jan 2020)

Self-Supervised Feature Specific Neural Matrix Completion

  • Mehmet Aktukmak,
  • Samuel M. Mercier,
  • Ismail Uysal

DOI
https://doi.org/10.1109/ACCESS.2020.3035120
Journal volume & issue
Vol. 8
pp. 198168–198177

Abstract


Unsupervised matrix completion algorithms mostly model the data generation process using linear latent variable models. Recently proposed algorithms introduce non-linearity via multi-layer perceptrons (MLP) and self-supervision by setting up a separate linear regression framework for each feature to estimate its missing values. In this article, we introduce an MLP-based algorithm called feature-specific neural matrix completion (FSNMC), which combines self-supervised and non-linear methods. The model parameters are estimated by a rotational scheme that separates the parameter and missing-value updates into sequential steps, with additional heuristic steps to prevent over-fitting and speed up convergence. The proposed algorithm specifically targets small- to medium-sized datasets. Experimental results on real-world and synthetic datasets of varying size, with a range of missing-value percentages, demonstrate superior accuracy for FSNMC compared to popular methods in the literature, especially at low sparsity. The proposed method has particular potential for estimating missing data collected via real experimentation in the fundamental life sciences.
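To make the idea in the abstract concrete, the sketch below illustrates one plausible reading of feature-specific, self-supervised imputation with MLPs: each feature (column) gets its own regressor trained on the remaining features, and missing-value estimates and model parameters are updated in alternating rounds. This is a minimal illustration only, not the authors' FSNMC implementation; the function name, the use of scikit-learn's MLPRegressor, the round count, and the hidden-layer sizes are all assumptions, and the paper's heuristic safeguards against over-fitting are omitted.

# Minimal sketch (assumed, not the authors' implementation) of alternating,
# feature-specific MLP imputation as described in the abstract.
import numpy as np
from sklearn.neural_network import MLPRegressor

def fsnmc_sketch(X, n_rounds=5, hidden=(32,), seed=0):
    """X: 2-D float array with np.nan marking missing entries."""
    X = np.asarray(X, dtype=float)
    missing = np.isnan(X)
    X_hat = X.copy()
    # Initialize missing entries with column means.
    col_means = np.nanmean(X, axis=0)
    X_hat[missing] = np.take(col_means, np.where(missing)[1])

    n_features = X.shape[1]
    for _ in range(n_rounds):
        for j in range(n_features):
            obs = ~missing[:, j]           # rows where feature j is observed
            if obs.all() or not obs.any():
                continue                   # nothing to impute or nothing to train on
            others = np.delete(np.arange(n_features), j)
            model = MLPRegressor(hidden_layer_sizes=hidden,
                                 max_iter=500, random_state=seed)
            # Self-supervision: predict feature j from the current estimates
            # of the other features, training only on rows where j is observed.
            model.fit(X_hat[obs][:, others], X_hat[obs, j])
            # Update only the missing entries of feature j.
            X_hat[~obs, j] = model.predict(X_hat[~obs][:, others])
    return X_hat

The alternating loop mirrors the "rotational scheme" mentioned in the abstract at a high level: within each round, model parameters are re-fit with the missing values held fixed, and the missing values are then re-estimated with the parameters held fixed.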

Keywords