
New accelerated conjugate gradient algorithms as a modification of Dai–Yuan’s computational scheme for unconstrained optimization

Authors :
Andrei, Neculai
Source :
Journal of Computational & Applied Mathematics. Oct 2010, Vol. 234 Issue 12, p3397-3410. 14p.
Publication Year :
2010

Abstract

New accelerated nonlinear conjugate gradient algorithms, obtained mainly as modifications of Dai and Yuan's computational scheme for unconstrained optimization, are proposed. Under exact line search the algorithms reduce to the Dai and Yuan conjugate gradient computational scheme; for inexact line searches they satisfy the sufficient descent condition. Since the step lengths in conjugate gradient algorithms may differ from 1 by two orders of magnitude and tend to vary in a very unpredictable manner, the algorithms are equipped with an acceleration scheme that improves their efficiency. Computational results on a set of 750 unconstrained optimization test problems show that these new conjugate gradient algorithms substantially outperform the Dai–Yuan conjugate gradient algorithm and its hybrid variants, the Hestenes–Stiefel, Polak–Ribière–Polyak, and CONMIN conjugate gradient algorithms, and the limited-memory quasi-Newton algorithm L-BFGS, and compare favorably with CG_DESCENT. In the frame of this numerical study, the accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm ASCALCG proved to be the most robust. [Copyright Elsevier]
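
The record describes the method only at a high level. For context, below is a minimal sketch of the classical (unaccelerated) Dai–Yuan iteration that the paper modifies: the direction update d_{k+1} = -g_{k+1} + beta_k d_k with the Dai–Yuan parameter beta_k = ||g_{k+1}||^2 / (d_k^T y_k), where y_k = g_{k+1} - g_k. The function name dai_yuan_cg, the tolerances, and the restart policy are illustrative assumptions; the paper's acceleration step and sufficient-descent modification are not reproduced here. SciPy's Wolfe line search is used, since Wolfe conditions ensure d_k^T y_k > 0 and hence descent for the DY scheme.

```python
import numpy as np
from scipy.optimize import line_search

def dai_yuan_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classical Dai-Yuan nonlinear CG (sketch; not the paper's accelerated variant)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Wolfe line search: guarantees d^T y > 0, so the DY direction is a descent direction.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            d = -g  # line search failed: restart along steepest descent
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta = (g_new @ g_new) / (d @ y)  # Dai-Yuan beta_k = ||g_{k+1}||^2 / (d_k^T y_k)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Rosenbrock test problem; the minimizer is (1, 1).
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    g = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                            200 * (x[1] - x[0]**2)])
    print(dai_yuan_cg(f, g, np.array([-1.2, 1.0])))
```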

Details

Language :
English
ISSN :
0377-0427
Volume :
234
Issue :
12
Database :
Academic Search Index
Journal :
Journal of Computational & Applied Mathematics
Publication Type :
Academic Journal
Accession number :
52224807
Full Text :
https://doi.org/10.1016/j.cam.2010.05.002