Kurdistan Journal of Applied Research (Apr 2025)
A Novel Conjugate Gradient Algorithm as a Convex Combination of Classical Conjugate Gradient Methods
Abstract
Conjugate gradient (CG) algorithms are effective for handling large-scale nonlinear optimization problems. The hybrid conjugate gradient (HCG) algorithm is one technique designed to solve unconstrained optimization problems efficiently. The HCG algorithm aims to improve convergence properties while keeping computations simple by combining features of classical conjugate gradient methods. In this paper, a new hybrid conjugate gradient algorithm is proposed and analyzed, obtained as a convex combination of the Dai-Yuan (DY), Hestenes-Stiefel (HS), and Hager-Zhang (HZ) conjugate gradient methods. The primary objective is to improve convergence efficiency and computational performance. The proposed algorithm is designed to reduce the number of iterations and the computational cost compared with traditional CG methods. Numerical experiments on standard unconstrained optimization test problems show that the hybrid method achieves faster convergence, often requiring far fewer iterations to reach a specified gradient norm tolerance or objective function value. Additionally, the per-iteration computational cost remains competitive, as the convex combination framework introduces minimal overhead. Theoretical analysis establishes the global convergence of the algorithm under standard assumptions. The results highlight the superior performance of the hybrid method in terms of the number of iterations and the total computational cost, especially for large-scale unconstrained problems. This work advances the development of efficient and robust CG algorithms, offering a practical solution for unconstrained optimization challenges.
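For concreteness, a convex-combination update of this kind can be sketched as follows; the abstract does not state the paper's exact mixing parameters, so the weights $\theta_k$ and $\mu_k$ below are illustrative:

$$\beta_k^{\mathrm{hyb}} = \theta_k\,\beta_k^{\mathrm{DY}} + \mu_k\,\beta_k^{\mathrm{HS}} + (1 - \theta_k - \mu_k)\,\beta_k^{\mathrm{HZ}}, \qquad \theta_k,\ \mu_k \ge 0,\quad \theta_k + \mu_k \le 1,$$

with the search direction $d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{hyb}} d_k$, $d_0 = -g_0$, where $g_k = \nabla f(x_k)$ and $y_k = g_{k+1} - g_k$. The three classical parameters being combined are the standard ones: $\beta_k^{\mathrm{DY}} = \|g_{k+1}\|^2 / d_k^{\top} y_k$, $\beta_k^{\mathrm{HS}} = g_{k+1}^{\top} y_k / d_k^{\top} y_k$, and $\beta_k^{\mathrm{HZ}} = \bigl(y_k - 2 d_k \|y_k\|^2 / d_k^{\top} y_k\bigr)^{\top} g_{k+1} / d_k^{\top} y_k$. Because each weight lies in $[0,1]$ and the weights sum to one, the hybrid parameter inherits properties of the three constituent methods at negligible extra cost per iteration.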
Keywords