Journal of Inequalities and Applications (May 2024)

A three-term conjugate gradient descent method with some applications

  • Ahmad Alhawarat
  • Zabidin Salleh
  • Hanan Alolaiyan
  • Hamid El Hor
  • Shahrina Ismail

DOI
https://doi.org/10.1186/s13660-024-03142-0
Journal volume & issue
Vol. 2024, no. 1
pp. 1–21

Abstract

The stationary points of optimization problems can be obtained via conjugate gradient (CG) methods without computing second derivatives. Researchers have used these methods to solve applications in various fields, such as neural networks and image restoration. In this study, we construct a three-term CG method that satisfies the descent property and admits a convergence analysis. In the second term, we employ the Hestenes-Stiefel CG formula with a restriction that keeps the coefficient positive. The third term is the negative gradient, used as a search direction, multiplied by an accelerating expression. We also provide numerical results, collected using a strong Wolfe line search with different sigma values, over 166 optimization functions from the CUTEr library. The results show that the proposed approach is far more efficient than other prevalent CG methods in terms of central processing unit (CPU) time, number of iterations, number of function evaluations, and number of gradient evaluations. We then apply the proposed three-term search direction to image restoration and compare the results with those of well-known CG methods with respect to the number of iterations, CPU time, and root-mean-square error (RMSE). Finally, we present three applications in regression analysis, image restoration, and electrical engineering.
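For a concrete picture of the update the abstract describes, the following is a minimal sketch, not the authors' exact method. It assumes the common three-term form d_{k+1} = -g_{k+1} + beta_k d_k - theta_k g_{k+1}, clips the Hestenes-Stiefel coefficient at zero (its "restriction to be positive"), and uses a hypothetical placeholder for the paper's unspecified accelerating expression theta_k; the step length comes from SciPy's strong Wolfe line search, with c2 playing the role of sigma.

```python
import numpy as np
from scipy.optimize import line_search  # enforces the strong Wolfe conditions

def three_term_cg(f, grad, x0, sigma=0.1, tol=1e-6, max_iter=1000):
    """Sketch of a three-term CG loop under the assumptions stated above.

    The HS coefficient is clipped at zero; `theta` below is a hypothetical
    stand-in for the paper's accelerating expression, which the abstract
    does not specify.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Strong Wolfe step (c2 corresponds to sigma in the paper's notation).
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=sigma)[0]
        if alpha is None:  # line search failed; take a small fallback step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        # Hestenes-Stiefel coefficient, clipped so it stays nonnegative.
        beta = max(0.0, (g_new @ y) / denom) if abs(denom) > 1e-12 else 0.0
        # Placeholder accelerating expression (assumption, not from the paper).
        theta = (g_new @ d) / (d @ d)
        # Three-term direction: -gradient + beta * old direction - theta * gradient.
        d = -g_new + beta * d - theta * g_new
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function; iterates should approach (1, 1).
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(three_term_cg(rosen, rosen_der, np.array([-1.2, 1.0])))
```

The clipping of beta mirrors the standard HS+ safeguard that keeps the second term from undoing the descent property; the actual coefficient, accelerating expression, and convergence conditions are given in the paper itself.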

Keywords