251. Another hybrid conjugate gradient algorithm for unconstrained optimization.
- Author
-
Neculai Andrei
- Subjects
ALGORITHMS, SCHLIEREN methods (Optics), CONJUGATE gradient methods, MATHEMATICAL optimization
- Abstract
Another hybrid conjugate gradient algorithm is subject to analysis. The parameter β k is computed as a convex combination of $$ \beta ^{{HS}}_{k} $$ (Hestenes-Stiefel) and $$ \beta ^{{DY}}_{k} $$ (Dai-Yuan), i.e. $$ \beta ^{C}_{k} = {\left( {1 - \theta _{k} } \right)}\beta ^{{HS}}_{k} + \theta _{k} \beta ^{{DY}}_{k} $$. The parameter θ k in the convex combination is computed so that the direction corresponding to the conjugate gradient algorithm is the Newton direction and the pair (s k , y k ) satisfies the quasi-Newton equation $$ \nabla ^{2} f{\left( {x_{{k+1}} } \right)}s_{k} = y_{k} $$, where $$ s_{k} = x_{{k+1}} - x_{k} $$ and $$ y_{k} = g_{{k+1}} - g_{k} $$. The algorithm uses the standard Wolfe line search conditions. Numerical comparisons with conjugate gradient algorithms show that this hybrid computational scheme outperforms the Hestenes-Stiefel and Dai-Yuan conjugate gradient algorithms, as well as the hybrid conjugate gradient algorithms of Dai and Yuan. A set of 750 unconstrained optimization problems is used, some of them from the CUTE library. [ABSTRACT FROM AUTHOR]
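The convex combination described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the standard Hestenes-Stiefel and Dai-Yuan formulas (β^HS = g_{k+1}ᵀy_k / d_kᵀy_k and β^DY = ‖g_{k+1}‖² / d_kᵀy_k) and takes θ_k as a given input, whereas Andrei's algorithm derives θ_k from the Newton-direction condition.

```python
import numpy as np

def beta_hybrid(g_new, g_old, d, theta):
    """Convex combination of the Hestenes-Stiefel and Dai-Yuan betas.

    g_new, g_old : gradients g_{k+1}, g_k
    d            : current search direction d_k
    theta        : combination parameter (assumed given here; in the
                   paper it is chosen so the resulting direction
                   approximates the Newton direction)
    """
    y = g_new - g_old            # y_k = g_{k+1} - g_k
    denom = d @ y                # shared denominator d_k^T y_k
    beta_hs = (g_new @ y) / denom        # Hestenes-Stiefel
    beta_dy = (g_new @ g_new) / denom    # Dai-Yuan
    theta = min(max(theta, 0.0), 1.0)    # keep the combination convex
    return (1.0 - theta) * beta_hs + theta * beta_dy
```

With θ_k = 0 the formula reduces to the Hestenes-Stiefel update and with θ_k = 1 to the Dai-Yuan update, which is what makes the scheme a genuine hybrid of the two.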
- Published
- 2008