Journal of Inequalities and Applications (Jan 2017)

Regularized gradient-projection methods for finding the minimum-norm solution of the constrained convex minimization problem

  • Ming Tian,
  • Hui-Fang Zhang

DOI
https://doi.org/10.1186/s13660-016-1289-4
Journal volume & issue
Vol. 2017, no. 1
pp. 1–12

Abstract

Let $H$ be a real Hilbert space and $C$ a nonempty closed convex subset of $H$. Assume that $g$ is a real-valued convex function and that the gradient $\nabla g$ is $\frac{1}{L}$-ism (inverse strongly monotone) with $L>0$. Let $0<\lambda<\frac{2}{L+2}$ and $0<\beta_{n}<1$. We prove that the sequence $\{x_{n}\}$ generated by the iterative algorithm $x_{n+1}=P_{C}(I-\lambda(\nabla g+\beta_{n}I))x_{n}$, $\forall n\geq 0$, converges strongly to $q\in U$, where $q=P_{U}(0)$ is the minimum-norm solution of the constrained convex minimization problem; $q$ also solves the variational inequality $\langle -q, p-q\rangle\leq 0$, $\forall p\in U$. Under suitable conditions, we obtain strong convergence theorems. As an application, we apply the algorithm to solve the split feasibility problem in Hilbert spaces.
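To make the iteration concrete, the following is a minimal Python sketch of the regularized gradient-projection step $x_{n+1}=P_{C}(I-\lambda(\nabla g+\beta_{n}I))x_{n}$. The objective $g(x)=\frac{1}{2}\|Ax-b\|^{2}$, the set $C$ (the nonnegative orthant), and the regularization schedule $\beta_{n}=\frac{1}{n+2}$ are illustrative assumptions, not choices made in the paper; the paper's theorems impose their own conditions on $\{\beta_{n}\}$.

```python
import numpy as np

def minimum_norm_gpm(grad_g, proj_C, L, x0, n_iters=5000):
    """Sketch of the regularized gradient-projection iteration
    x_{n+1} = P_C(x_n - lam * (grad_g(x_n) + beta_n * x_n)).

    grad_g : gradient of the convex objective g (assumed L-Lipschitz,
             hence 1/L-ism by the Baillon-Haddad theorem)
    proj_C : metric projection onto the closed convex set C
    L      : Lipschitz constant of grad_g
    x0     : starting point
    """
    lam = 1.9 / (L + 2)           # step size satisfying 0 < lambda < 2/(L+2)
    x = x0
    for n in range(n_iters):
        beta = 1.0 / (n + 2)      # illustrative choice with 0 < beta_n < 1, beta_n -> 0
        x = proj_C(x - lam * (grad_g(x) + beta * x))
    return x

# Hypothetical example: least squares over the nonnegative orthant.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = A @ rng.random(10)
L = np.linalg.norm(A.T @ A, 2)    # Lipschitz constant of grad g

q = minimum_norm_gpm(
    grad_g=lambda x: A.T @ (A @ x - b),
    proj_C=lambda x: np.maximum(x, 0.0),
    L=L,
    x0=np.zeros(10),
)
print(q)  # approximates the minimum-norm minimizer of g over C
```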

Keywords