Algorithms (Apr 2025)

The Prediction Performance Analysis of the Lasso Model with Convex Non-Convex Sparse Regularization

  • Wei Chen,
  • Qiuyue Liu,
  • Hancong Li,
  • Jian Zou

DOI
https://doi.org/10.3390/a18040195
Journal volume & issue
Vol. 18, no. 4
p. 195

Abstract

The ℓ1 regularization in Lasso regression plays a crucial role by making the objective function convex and therefore easy to minimize; however, compared with non-convex regularization, ℓ1 regularization introduces bias by artificially shrinking coefficients towards zero. Recently, the convex non-convex (CNC) regularization framework has emerged as a powerful technique that allows non-convex regularization terms to be incorporated while preserving the overall convexity of the optimization problem. Although this method has shown remarkable performance in various empirical studies, its theoretical understanding is still relatively limited. In this paper, we provide a theoretical investigation of the prediction performance of the Lasso model with CNC sparse regularization. By leveraging oracle inequalities, we establish a tighter upper bound on prediction performance than that of the traditional ℓ1 regularizer. Additionally, we propose an alternating direction method of multipliers (ADMM) algorithm to solve the proposed model efficiently and rigorously analyze its convergence properties. Our numerical results, evaluated on both synthetic data and real-world magnetic resonance imaging (MRI) reconstruction tasks, confirm the superior effectiveness of the proposed approach.
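To make the contrast between ℓ1 and CNC-type regularization concrete, the sketch below solves the sparse regression problem min_x ½‖y − Ax‖² + λ·penalty(x) with a standard ADMM splitting (as in Boyd et al.), using soft thresholding for the ℓ1 penalty and firm thresholding, the proximal operator of the scalar minimax concave penalty, as an illustrative stand-in for a non-convex regularizer that shrinks large coefficients less. This is a minimal sketch under assumed names and parameters (admm_sparse_regression, rho, gamma), not the authors' algorithm: the paper's CNC model, convexity condition, and ADMM updates differ in detail and are developed in the paper itself.

```python
import numpy as np

def soft(t, thr):
    # Soft thresholding: proximal operator of the l1 penalty.
    return np.sign(t) * np.maximum(np.abs(t) - thr, 0.0)

def firm(t, thr, gamma):
    # Firm thresholding: proximal operator of the scalar minimax concave (MC)
    # penalty with parameters (thr, gamma), gamma > 1.  Coefficients with
    # |t| > gamma * thr are left unshrunk, which illustrates the reduced bias
    # relative to soft thresholding mentioned in the abstract.
    out = np.where(np.abs(t) <= gamma * thr,
                   np.sign(t) * np.maximum(np.abs(t) - thr, 0.0) / (1.0 - 1.0 / gamma),
                   t)
    return np.where(np.abs(t) <= thr, 0.0, out)

def admm_sparse_regression(A, y, lam, rho=1.0, gamma=None, n_iter=200):
    """ADMM sketch for min_x 0.5*||y - A x||^2 + lam * penalty(x).

    penalty is l1 (soft thresholding) when gamma is None, otherwise the
    scalar MC penalty (firm thresholding) as a hypothetical stand-in for a
    CNC-type regularizer; hyperparameters are illustrative, not tuned.
    """
    n = A.shape[1]
    AtA, Aty = A.T @ A, A.T @ y
    # Cache the Cholesky factor of (A^T A + rho I) for the x-update.
    L = np.linalg.cholesky(AtA + rho * np.eye(n))
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)
    for _ in range(n_iter):
        # x-update: ridge-type least-squares solve (A^T A + rho I) x = A^T y + rho (z - u).
        rhs = Aty + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: proximal step on the chosen penalty.
        t = x + u
        z = soft(t, lam / rho) if gamma is None else firm(t, lam / rho, gamma)
        # Scaled dual update.
        u = u + x - z
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 50))
    x_true = np.zeros(50)
    x_true[:5] = 3.0
    y = A @ x_true + 0.1 * rng.standard_normal(100)
    x_l1 = admm_sparse_regression(A, y, lam=5.0)              # l1 (Lasso)
    x_mc = admm_sparse_regression(A, y, lam=5.0, gamma=4.0)   # MC-type penalty
    print(np.linalg.norm(x_l1 - x_true), np.linalg.norm(x_mc - x_true))
```

Note that with a non-convex scalar penalty the composite objective is only convex overall when the amount of non-convexity is suitably bounded relative to the data fidelity term; establishing and exploiting this CNC condition, and proving convergence of the resulting ADMM iterations, is precisely the contribution described in the abstract.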

Keywords