A New Family of Hybrid Three-Term Conjugate Gradient Methods (BNC-BTC) Based on the Scaled Memoryless BFGS Update for Unconstrained Optimization Problems
Abstract
Conjugate gradient (CG) methods are an important class of techniques for solving unconstrained optimization problems. However, no single existing CG algorithm is effective across all classes of problems. In particular, on some problems traditional CG techniques exhibit slow convergence or even fail to converge. On large-scale problems, these inefficiencies frequently stem from the methods' inability to maintain suitable descent directions or to approximate the Hessian matrix accurately. Hence, this paper introduces a new hybrid CG method for solving unconstrained optimization problems. The proposed method combines two parameters proposed by Hassan and Alashoor and is based on the scaled memoryless Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton update. The resulting direction satisfies the descent condition, and global convergence is established under the Wolfe and Armijo-like line-search conditions together with the other standard assumptions. Numerical experiments on a set of benchmark test problems show that the proposed method is more efficient than other existing methods.
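For orientation, the display below is a sketch of the three-term search direction produced by the standard scaled memoryless BFGS update, with scaling parameter $\theta_k$, $s_k = x_{k+1} - x_k$, and $y_k = g_{k+1} - g_k$; it illustrates only the general structure underlying directions of this type, not the specific parameter choices of the proposed BNC-BTC method, which are defined in the body of the paper.

\[
d_{k+1} \;=\; -\theta_k\, g_{k+1}
\;+\;\left(\theta_k \frac{g_{k+1}^{T} y_k}{y_k^{T} s_k}
\;-\;\Bigl(1+\theta_k \frac{y_k^{T} y_k}{y_k^{T} s_k}\Bigr)\frac{g_{k+1}^{T} s_k}{y_k^{T} s_k}\right) s_k
\;+\;\theta_k \frac{g_{k+1}^{T} s_k}{y_k^{T} s_k}\, y_k ,
\]

which follows from $d_{k+1} = -H_{k+1} g_{k+1}$, where $H_{k+1}$ is the memoryless BFGS approximation of the inverse Hessian obtained by updating $\theta_k I$ with the pair $(s_k, y_k)$.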