
A Modified Self-Scaling Memoryless Broyden-Fletcher-Goldfarb-Shanno Method for Unconstrained Optimization.

Authors :
Kou, C.
Dai, Y.
Source :
Journal of Optimization Theory & Applications. Apr2015, Vol. 165 Issue 1, p209-224. 16p.
Publication Year :
2015

Abstract

The introduction of quasi-Newton and nonlinear conjugate gradient methods revolutionized the field of nonlinear optimization. The self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (SSML-BFGS) method of Perry (Discussion Paper 269, 1977) and Shanno (SIAM J Numer Anal, 15, 1247-1257, 1978) provided a good understanding of the relationship between the two classes of methods. Based on the SSML-BFGS method, new conjugate gradient algorithms, called CG_DESCENT and CGOPT, were proposed by Hager and Zhang (SIAM J Optim, 16, 170-192, 2005) and Dai and Kou (SIAM J Optim, 23, 296-320, 2013), respectively. It is somewhat surprising that these two conjugate gradient methods perform more efficiently than the SSML-BFGS method. In this paper, we propose suitable modifications of the SSML-BFGS method such that the sufficient descent condition holds. Convergence analysis of the modified method is carried out for convex and nonconvex functions, respectively. Numerical experiments on test problems from the Constrained and Unconstrained Testing Environment (CUTE) collection demonstrate that the modified SSML-BFGS method yields a desirable improvement over CGOPT and the original SSML-BFGS method. [ABSTRACT FROM AUTHOR]
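
For orientation, the following is a minimal Python sketch of the baseline SSML-BFGS search direction with the Perry-Shanno self-scaling parameter tau_k = (y_k'y_k)/(s_k'y_k). It is not the authors' modified method from this paper; the curvature safeguard and its tolerance are illustrative assumptions added here.

    # Minimal sketch of the baseline SSML-BFGS direction (Perry-Shanno scaling).
    # Not the paper's modified method; fallback test and tolerance are assumptions.
    import numpy as np

    def ssml_bfgs_direction(g, s, y):
        """Return d = -H g, where H is the BFGS update of (1/tau) I.

        g : gradient at the new iterate x_{k+1}
        s : step s_k = x_{k+1} - x_k
        y : gradient change y_k = g_{k+1} - g_k
        """
        sy = s @ y
        # If the curvature condition s'y > 0 (nearly) fails, restart with
        # steepest descent rather than form an indefinite update.
        if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
            return -g
        rho = 1.0 / sy
        tau = (y @ y) * rho            # Perry-Shanno self-scaling parameter
        sg, yg = s @ g, y @ g
        # H = (1/tau) * (I - rho*(s y' + y s') + rho^2*(y'y) s s') + rho * s s',
        # the memoryless BFGS update of (1/tau) I, so d = -H g expands to:
        d = (-g + rho * (yg * s + sg * y)
             - rho**2 * (y @ y) * sg * s) / tau - rho * sg * s
        return d

Note that this baseline direction need not satisfy a uniform sufficient descent bound d'g <= -c ||g||^2 on nonconvex problems; securing such a bound is precisely the role of the modification proposed in the paper.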

Details

Language :
English
ISSN :
00223239
Volume :
165
Issue :
1
Database :
Academic Search Index
Journal :
Journal of Optimization Theory & Applications
Publication Type :
Academic Journal
Accession Number :
101805865
Full Text :
https://doi.org/10.1007/s10957-014-0528-4