A simple three-term conjugate gradient algorithm for unconstrained optimization
- Author
- Andrei, Neculai
- Subjects
- *CONJUGATE gradient methods, *ALGORITHMS, *CONSTRAINED optimization, *APPROXIMATION theory, *NUMERICAL analysis, *COMPARATIVE studies
- Abstract
A simple three-term conjugate gradient algorithm that satisfies both the descent condition and the conjugacy condition is presented. The algorithm modifies the Hestenes and Stiefel algorithm (Hestenes and Stiefel, 1952) [10], or that of Hager and Zhang (Hager and Zhang, 2005) [23], so that the search direction is a descent direction and satisfies the conjugacy condition; both properties hold independently of the line search. The algorithm can also be viewed as a modification of the memoryless BFGS quasi-Newton method. The new approximation of the minimum is obtained by a general Wolfe line search, combined with a standard acceleration technique developed by Andrei (2009) [27]. For uniformly convex functions, under standard assumptions, global convergence of the algorithm is proved. Numerical comparisons of the suggested three-term conjugate gradient algorithm against six other three-term conjugate gradient algorithms, on a set of 750 unconstrained optimization problems, show that all these computational schemes have similar performance, the suggested one being slightly faster and more robust. The proposed three-term conjugate gradient algorithm substantially outperforms the well-known Hestenes and Stiefel conjugate gradient algorithm, as well as the more elaborate CG_DESCENT algorithm. Using five applications from the MINPACK-2 test problem collection (Averick et al., 1992) [25], with variables, we show that the suggested three-term conjugate gradient algorithm is the top performer versus CG_DESCENT. [Copyright Elsevier]
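The abstract does not give the paper's exact update formulas, but the idea of a Hestenes-Stiefel-type three-term direction whose descent property holds regardless of the line search can be sketched as follows. This is an illustrative sketch, not the author's algorithm: the direction formula below is a standard three-term HS variant, and the simple Armijo backtracking stands in for the Wolfe line search with acceleration described in the paper.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Illustrative three-term CG sketch (NOT the paper's exact method).

    Direction update (a common Hestenes-Stiefel-type three-term form):
        d_{k+1} = -g_{k+1} + (g'y / y's) s - (g's / y's) y
    where s = x_{k+1} - x_k and y = g_{k+1} - g_k.  The two extra terms
    cancel in g'd, so g'd = -||g||^2: the direction is descent
    independently of the line search, as the abstract emphasizes.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (stand-in for the general
        # Wolfe search with acceleration used in the paper).
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        ys = y @ s
        if abs(ys) > 1e-12:
            d = -g_new + (g_new @ y / ys) * s - (g_new @ s / ys) * y
        else:
            d = -g_new  # degenerate curvature: restart with steepest descent
        x, g = x_new, g_new
    return x

# Usage on a uniformly convex quadratic f(x) = 0.5 x'Ax - b'x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = three_term_cg(f, grad, np.zeros(2))
```

Because the extra terms cancel exactly in the inner product with the gradient, the descent condition is structural rather than a consequence of step-size choice, which is the property the abstract highlights.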
- Published
- 2013