
A new family of globally convergent conjugate gradient methods.

Authors :
Sellami, B.
Chaib, Y.
Source :
Annals of Operations Research. Jun2016, Vol. 241 Issue 1/2, p497-513. 17p.
Publication Year :
2016

Abstract

Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and have been studied extensively in recent years. In this paper, a new family of conjugate gradient methods is proposed for unconstrained optimization. This family includes two existing practical nonlinear conjugate gradient methods, produces a descent search direction at every iteration, and converges globally provided that the line search satisfies the Wolfe conditions. Numerical experiments testing the efficiency of the new method indicate that it is promising. In addition, the methods related to this family are discussed in a uniform framework. [ABSTRACT FROM AUTHOR]
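The framework the abstract describes — a search direction built from the current gradient and the previous direction, with a step length satisfying the Wolfe conditions — can be sketched generically. The code below is not the paper's new family: it uses the classical Fletcher–Reeves beta as a stand-in, and the test function, tolerances, and Wolfe parameters (`c1`, `c2`) are illustrative choices.

```python
# Generic nonlinear conjugate gradient method with a weak-Wolfe line
# search. Illustrative sketch only: the beta formula here is the
# classical Fletcher-Reeves choice, not the paper's new family.

def f(x):
    # Simple convex test function: (x0 - 1)^2 + 10*(x1 - 2)^2
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] - 2.0) ** 2

def grad(x):
    return [2.0 * (x[0] - 1.0), 20.0 * (x[1] - 2.0)]

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def wolfe_line_search(x, d, c1=1e-4, c2=0.4, alpha=1.0):
    """Bisection search for a step satisfying the weak Wolfe conditions."""
    fx, gd = f(x), dot(grad(x), d)   # gd < 0 for a descent direction
    lo, hi = 0.0, float("inf")
    for _ in range(50):
        xn = [xi + alpha * di for xi, di in zip(x, d)]
        if f(xn) > fx + c1 * alpha * gd:       # sufficient decrease fails
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif dot(grad(xn), d) < c2 * gd:       # curvature condition fails
            lo = alpha
            alpha = 2.0 * alpha if hi == float("inf") else 0.5 * (lo + hi)
        else:
            return alpha                        # both Wolfe conditions hold
    return alpha

def cg_minimize(x0, tol=1e-8, max_iter=200):
    x, g = x0[:], grad(x0)
    d = [-gi for gi in g]                       # start with steepest descent
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            break
        alpha = wolfe_line_search(x, d)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = dot(g_new, g_new) / dot(g, g)    # Fletcher-Reeves beta
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        if dot(g_new, d) >= 0:                  # safeguard: restart if not descent
            d = [-gn for gn in g_new]
        g = g_new
    return x

x = cg_minimize([0.0, 0.0])
print(x)  # x approaches the minimizer [1.0, 2.0]
```

The restart safeguard mirrors the abstract's emphasis on guaranteed descent: whenever the updated direction fails to be a descent direction, the iteration falls back to steepest descent, which is one common way such guarantees are enforced in practice.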

Details

Language :
English
ISSN :
0254-5330
Volume :
241
Issue :
1/2
Database :
Academic Search Index
Journal :
Annals of Operations Research
Publication Type :
Academic Journal
Accession number :
115967629
Full Text :
https://doi.org/10.1007/s10479-016-2120-9