Journal of Inequalities and Applications (Jan 2022)

Two efficient modifications of AZPRP conjugate gradient method with sufficient descent property

  • Zabidin Salleh,
  • Adel Almarashi,
  • Ahmad Alhawarat

DOI
https://doi.org/10.1186/s13660-021-02746-0
Journal volume & issue
Vol. 2022, no. 1
pp. 1 – 21

Abstract

The conjugate gradient method can be applied in many fields, such as neural networks, image restoration, machine learning, and deep learning. The Polak–Ribière–Polyak (PRP) and Hestenes–Stiefel conjugate gradient methods are among the most efficient methods for solving nonlinear optimization problems. However, neither method satisfies the descent property or the global convergence property for general nonlinear functions. In this paper, we present two new modifications of the PRP method with restart conditions. The proposed conjugate gradient methods satisfy the global convergence property and the descent property for general nonlinear functions. The numerical results show that the new modifications are more efficient than recent CG methods in terms of the number of iterations, number of function evaluations, number of gradient evaluations, and CPU time.
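To make the setting concrete, the following is a minimal sketch of a nonlinear conjugate gradient iteration with a PRP-type restart. It uses the classical PRP+ safeguard (beta clipped at zero, which restarts along the steepest-descent direction), not the paper's specific modifications; the function names, the quadratic test problem, and the Armijo line-search parameters are all illustrative assumptions.

```python
import numpy as np

def prp_plus_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the PRP+ restart rule beta = max(beta_PRP, 0).

    beta_PRP = g_new . (g_new - g_old) / ||g_old||^2.  Clipping beta at
    zero restarts the search along -g, a classical way to safeguard the
    PRP method (illustrative only; not the modifications of this paper).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0.0:      # not a descent direction: restart
            d = -g
        # Backtracking (Armijo) line search.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)  # PRP+ restart
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize a strictly convex quadratic 0.5 x'Ax - b'x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = prp_plus_cg(f, grad, np.zeros(2))
print(np.round(x_star, 4))
```

On this quadratic the iteration recovers the solution of A x = b (here x = (0.2, 0.4)); the descent-direction check and the clipped beta are the ingredients that restart-based PRP variants build on.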

Keywords