Journal of Mathematics (Jan 2023)

A Modified Dai–Liao Conjugate Gradient Method Based on a Scalar Matrix Approximation of Hessian and Its Application

  • Branislav Ivanov,
  • Gradimir V. Milovanović,
  • Predrag S. Stanimirović,
  • Aliyu Muhammed Awwal,
  • Lev A. Kazakovtsev,
  • Vladimir N. Krutikov

DOI
https://doi.org/10.1155/2023/9945581
Journal volume & issue
Vol. 2023

Abstract

We introduce and investigate proper accelerations of the Dai–Liao (DL) conjugate gradient (CG) family of iterations for solving large-scale unconstrained optimization problems. The improvements are based on appropriate modifications of the CG update parameter in DL conjugate gradient methods. The leading idea is to combine the search directions of accelerated gradient descent methods, defined through a Hessian approximation by an appropriate diagonal matrix in the spirit of quasi-Newton methods, with the search directions of DL-type CG methods. The global convergence of the modified Dai–Liao conjugate gradient method is proved on the set of uniformly convex functions. The efficiency and robustness of the newly presented methods are confirmed by comparison with similar methods, analyzing numerical results concerning CPU time, the number of function evaluations, and the number of iterative steps. The proposed method is successfully applied to an optimization problem arising in 2D robotic motion control.
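
The accelerated DL-type iteration described in the abstract can be illustrated with a minimal Python sketch. The code below is only a hedged illustration, assuming a standard Dai–Liao update parameter with a fixed t, an Armijo backtracking line search, and a scalar matrix gamma*I obtained from a second-order Taylor condition as the Hessian approximation; the exact update rules of the published method may differ, and the function name dl_cg_scalar_hessian is illustrative only.

import numpy as np

def dl_cg_scalar_hessian(f, grad, x0, t=0.1, tol=1e-6, max_iter=1000):
    # Illustrative DL-type CG iteration with a scalar-matrix (gamma * I)
    # Hessian approximation; not the authors' exact published scheme.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g.dot(d) >= 0.0:          # safeguard: restart with steepest descent
            d = -g
        # Armijo backtracking line search along d
        alpha, rho, sigma = 1.0, 0.5, 1e-4
        fx = f(x)
        while f(x + alpha * d) > fx + sigma * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Scalar Hessian approximation gamma * I from a second-order Taylor condition
        gamma = 2.0 * (f(x_new) - fx - alpha * g.dot(d)) / (alpha ** 2 * d.dot(d))
        gamma = max(gamma, 1e-10)    # keep the scalar approximation positive
        # Dai-Liao CG parameter: beta = (g_{k+1}^T y_k - t * g_{k+1}^T s_k) / (d_k^T y_k)
        denom = d.dot(y)
        beta = (g_new.dot(y) - t * g_new.dot(s)) / denom if abs(denom) > 1e-12 else 0.0
        # Accelerated search direction: gradient term scaled by the inverse scalar Hessian
        d = -(1.0 / gamma) * g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a simple convex quadratic (illustrative only)
if __name__ == "__main__":
    A = np.diag(np.arange(1.0, 6.0))
    f = lambda x: 0.5 * x.dot(A @ x)
    grad = lambda x: A @ x
    print(dl_cg_scalar_hessian(f, grad, np.ones(5)))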