Fixed Point Theory and Applications (Jan 2009)

Super-Relaxed (η)-Proximal Point Algorithms, Relaxed (η)-Proximal Point Algorithms, Linear Convergence Analysis, and Nonlinear Variational Inclusions

  • Agarwal, Ravi P.,
  • Verma, Ram U.

Journal volume & issue
Vol. 2009, no. 1
Article ID 957407

Abstract


We survey recent advances in the general theory of maximal (set-valued) monotone mappings and their role in convex programming and the closely related field of nonlinear variational inequalities. We focus mostly on applications of the super-relaxed (η)-proximal point algorithm to solving a class of nonlinear variational inclusion problems, based on the notion of maximal (η)-monotonicity. The investigations highlighted in this communication are greatly influenced by the celebrated work of Rockafellar (1976), while other contributions have also played a significant part, notably the generalization of Rockafellar's proximal point algorithm to the relaxed proximal point algorithm of Eckstein and Bertsekas (1992). Even for the linear convergence analysis of the overrelaxed (or super-relaxed) (η)-proximal point algorithm, the fundamental model for Rockafellar's case does the job. Furthermore, we explore possibilities of generalizing the Yosida regularization/approximation in light of maximal (η)-monotonicity, and then applying it to first-order evolution equations/inclusions.
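For orientation, the relaxed proximal point iteration of Eckstein and Bertsekas (1992), which the super-relaxed (η)-scheme generalizes, can be sketched for a maximal monotone operator M as follows (the symbols x^k, c_k, λ_k, and J_{cM} are notation introduced here only for illustration):

\[
x^{k+1} = (1-\lambda_k)\,x^k + \lambda_k\, J_{c_k M}\bigl(x^k\bigr),
\qquad J_{cM} := (I + cM)^{-1}, \quad c_k > 0,
\]

where J_{cM} is the classical resolvent and λ_k is the relaxation parameter, with λ_k ∈ (0, 2) in the relaxed case and over-relaxation (λ_k > 1) permitted in the super-relaxed case. This is only the standard special case, stated for orientation; in the (η)-monotone setting studied here, the classical resolvent is replaced by a generalized resolvent associated with the maximal (η)-monotonicity of M.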