
A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search.

Authors:
Hager, William W.
Zhang, Hongchao
Source:
SIAM Journal on Optimization; 2005, Vol. 16 Issue 1, p170-192, 23p, 1 Chart, 6 Graphs
Publication Year:
2005

Abstract

A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes–Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition ${\bf g}_k^{\sf T} {\bf d}_k \le -\frac{7}{8} \|{\bf g}_k\|^2$. Moreover, a global convergence result is established when the line search fulfills the Wolfe conditions. A new line search scheme is developed that is efficient and highly accurate. Efficiency is achieved by exploiting properties of linear interpolants in a neighborhood of a local minimizer. High accuracy is achieved by using a convergence criterion, which we call the "approximate Wolfe" conditions, obtained by replacing the sufficient decrease criterion in the Wolfe conditions with an approximation that can be evaluated with greater precision in a neighborhood of a local minimum than the usual sufficient decrease criterion. Numerical comparisons are given with both L-BFGS and conjugate gradient methods using the unconstrained optimization problems in the CUTE library.
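To make the abstract's ingredients concrete, the following is a minimal NumPy sketch, not the authors' CG_DESCENT implementation: a conjugate gradient direction update using the Hager–Zhang beta (the modified Hestenes–Stiefel formula developed in the full paper), a check of the guaranteed descent condition $g_k^T d_k \le -\frac{7}{8}\|g_k\|^2$, and the approximate Wolfe test, in which the sufficient decrease inequality is replaced by $(2\delta - 1)\,\phi'(0) \ge \phi'(\alpha)$. The quadratic test problem, the parameter values delta = 0.1 and sigma = 0.9, and the closed-form exact line search are illustrative assumptions standing in for the paper's line search routine.

import numpy as np

def hz_direction(g_new, g_old, d_old):
    """Direction update d_{k+1} = -g_{k+1} + beta_k * d_k with the
    Hager-Zhang beta, a modified Hestenes-Stiefel formula."""
    y = g_new - g_old
    dy = d_old @ y
    beta = (y - 2.0 * d_old * (y @ y) / dy) @ g_new / dy
    return -g_new + beta * d_old

def satisfies_descent(g, d):
    """Guaranteed descent condition from the paper: g^T d <= -(7/8)||g||^2."""
    return g @ d <= -0.875 * (g @ g)

def approximate_wolfe(dphi0, dphi_a, delta=0.1, sigma=0.9):
    """Approximate Wolfe conditions: the sufficient decrease test becomes
    (2*delta - 1) * phi'(0) >= phi'(alpha), paired with the usual
    curvature condition phi'(alpha) >= sigma * phi'(0)."""
    return (2.0 * delta - 1.0) * dphi0 >= dphi_a >= sigma * dphi0

# Demonstration on the convex quadratic f(x) = 0.5 x^T A x, whose exact
# line-search minimizer along d is available in closed form.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x = np.array([4.0, 1.0])
g = grad(x)
d = -g                                  # first direction: steepest descent
for k in range(10):
    alpha = -(g @ d) / (d @ A @ d)      # exact minimizer along d
    x = x + alpha * d
    g_new = grad(x)
    print(f"iter {k}: approximate Wolfe holds: "
          f"{approximate_wolfe(g @ d, g_new @ d)}")
    if np.linalg.norm(g_new) < 1e-10:   # converged; stop before beta degenerates
        break
    d = hz_direction(g_new, g, d)
    g = g_new
    print(f"iter {k}: ||g|| = {np.linalg.norm(g):.2e}, "
          f"descent holds: {satisfies_descent(g, d)}")

Note that with the exact line search used here, g_{k+1} is orthogonal to d_k, so the Hager–Zhang correction term vanishes and the update coincides with Hestenes–Stiefel, matching the reduction stated in the abstract.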

Details

Language:
English
ISSN:
1052-6234
Volume:
16
Issue:
1
Database:
Complementary Index
Journal:
SIAM Journal on Optimization
Publication Type:
Academic Journal
Accession Number:
18491378
Full Text:
https://doi.org/10.1137/030601880