IEEE Access (Jan 2020)

Nonconvex Sparse Representation With Slowly Vanishing Gradient Regularizers

  • Eunwoo Kim
  • Minsik Lee
  • Songhwai Oh

DOI
https://doi.org/10.1109/ACCESS.2020.3009971
Journal volume & issue
Vol. 8
pp. 132489–132501

Abstract

Sparse representation has been widely used over the past decade in computer vision and signal processing to model a wide range of natural phenomena. For computational convenience and robustness against noise, the optimization problem for sparse representation is often relaxed using convex or nonconvex surrogates in place of the l0-norm, the ideal sparsity penalty. In this paper, we pose the following question for nonconvex sparsity-promoting surrogates: what makes a good sparsity surrogate for general nonconvex systems? As an answer, we suggest that the difficulty of handling the l0-norm comes not only from its nonconvexity but also from its gradient being zero or not well-defined. Accordingly, we propose criteria that a good nonconvex surrogate should satisfy and suggest a corresponding family of surrogates. The proposed family admits a simple regularizer, which enables efficient computation. The proposed surrogate combines the benefits of the l0- and l1-norms and, most importantly, its gradient vanishes slowly, which allows stable optimization. We apply the proposed surrogate to well-known sparse representation problems and benchmark datasets to demonstrate its robustness and efficiency.
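
The abstract's central point is that a useful surrogate should behave like the l1-norm near zero (a well-defined, nonvanishing gradient) while flattening toward the l0-norm for large values, without its gradient dying out abruptly. Below is a minimal NumPy sketch of that idea. The log penalty phi(x) = log(1 + |x|/eps) and the parameter eps are illustrative assumptions for the sketch, not the family proposed in the paper.

    # Minimal sketch (NumPy only) of why gradient behavior matters when
    # choosing a sparsity surrogate. The log penalty used here is a common
    # nonconvex example and an ASSUMPTION; it is not necessarily the
    # family of surrogates proposed in this paper.
    import numpy as np

    def l1_grad(x):
        # l1 gradient: constant magnitude 1 everywhere away from zero
        # (convex, but biases large coefficients).
        return np.sign(x)

    def log_surrogate(x, eps=0.1):
        # Nonconvex surrogate: ~ |x|/eps (l1-like) near 0,
        # grows only logarithmically (l0-like) for large |x|.
        return np.log1p(np.abs(x) / eps)

    def log_surrogate_grad(x, eps=0.1):
        # Gradient decays like 1/|x|: it vanishes, but slowly, so a
        # gradient-based solver still receives a usable signal.
        return np.sign(x) / (eps + np.abs(x))

    x = np.array([1e-3, 0.1, 1.0, 10.0])
    print("l1 gradient:       ", l1_grad(x))
    print("surrogate gradient:", log_surrogate_grad(x))
    # The l0 "gradient" would be 0 almost everywhere (and undefined at 0),
    # which is exactly the optimization difficulty the abstract points to.

For large |x| the sketch's surrogate gradient decays smoothly rather than dropping to zero, which is the "slowly vanishing gradient" property the paper identifies as the key to stable optimization.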

Keywords