IEEE Access (Jan 2024)
Novel Hybrid Sparse and Low-Rank Representation With Auto-Weight Minimax Lγ Concave Penalty for Image Denoising
Abstract
Image denoising techniques often rely on convex relaxations, which can introduce bias into the estimates. To address this, non-convex regularizers such as weighted nuclear norm minimization and weighted Schatten p-norm minimization have been proposed. However, current implementations often rely on heuristic weight selection and neglect the potential of automated weighting strategies. This work introduces a novel non-convex, non-separable regularization term aimed at achieving a hybrid representation that leverages both low-rank (LR) and global sparse gradient (GS) structures. An iteratively auto-weighted Equivalent Minimax $L_{\gamma}$ Concave (EMLC) penalty is proposed as the non-convex relaxation. To enhance sparsity and improve low-rank estimation, an EMLC-LRGS-based image denoising model is presented that integrates the global gradient sparsity and LR priors within a unified framework using the EMLC penalty. The formulation addresses the limitations of convex relaxations by employing an equivalent representation of the weighted minimax $L_{\gamma}$ concave penalty as a combined global-sparsity and local-smoothness regularizer in the gradient domain, which aligns more closely with the data acquisition model and prior knowledge. To exploit the inherent low-rank structure of images, an equivalent representation of the weighted $L_{\gamma}$ norm is employed as a low-rank regularization term applied to groups of similar image patches. The model is solved efficiently by an adaptive alternating direction method of multipliers (ADMM) algorithm that dynamically tunes the weighting parameters while promoting sparsity and a low-rank representation. Comprehensive comparisons with state-of-the-art image denoising models demonstrate the superiority of the proposed approach in image denoising tasks.
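For orientation, a schematic form of such a hybrid sparse-plus-low-rank objective is sketched below; the notation ($y$ the noisy image, $x$ the latent image, $\nabla x$ its gradient, $\mathcal{R}_{j}x$ the $j$-th group of similar patches stacked as a matrix, $\sigma(\cdot)$ its singular values, $\phi_{\mathrm{EMLC}}$ the non-convex penalty, $\lambda_{1},\lambda_{2}$ regularization weights) is illustrative and does not reproduce the paper's exact formulation:
$$
\hat{x} \;=\; \arg\min_{x}\; \tfrac{1}{2}\,\|y - x\|_{2}^{2} \;+\; \lambda_{1}\,\phi_{\mathrm{EMLC}}(\nabla x) \;+\; \lambda_{2}\sum_{j}\phi_{\mathrm{EMLC}}\!\big(\sigma(\mathcal{R}_{j}x)\big),
$$
where the gradient term enforces global sparsity with local smoothness and the penalty on singular values of each patch group promotes low rank; an ADMM-type solver would alternate over these terms, with the weights inside $\phi_{\mathrm{EMLC}}$ updated automatically at each iteration.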
Keywords