
Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions

Authors :
Zhou Sheng
Gonglin Yuan
Xiaoliang Wang
Source :
Numerical Algorithms. 84:935-956
Publication Year :
2019
Publisher :
Springer Science and Business Media LLC, 2019.

Abstract

It is well known that conjugate gradient algorithms are widely applied in many practical fields, such as engineering problems and finance models, because they are straightforward and characterized by a simple structure and low storage. However, challenging problems remain, such as the convergence of the PRP algorithm for nonconvex functions under an inexact line search, establishing the sufficient descent property for all conjugate gradient methods, and other theoretical properties concerning global convergence and the trust-region feature for nonconvex functions. This paper studies a family of conjugate gradient formulas based on the six classic formulas PRP, HS, CD, FR, LS, and DY; the resulting family of conjugate gradient algorithms has better theoretical properties than the classic formulas themselves. Furthermore, the presented technique can be extended to any two-term conjugate gradient formula. This paper designs family conjugate gradient algorithms for nonconvex functions that possess the following features without additional conditions: (i) the sufficient descent property holds, (ii) the trust-region feature is true, and (iii) global convergence holds under standard assumptions. Numerical results show that the given conjugate gradient algorithms are competitive with standard methods.
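For readers unfamiliar with the methods the abstract compares against, the sketch below shows a classic two-term nonlinear conjugate gradient iteration using the PRP formula (one of the six baselines named above) with its common nonnegative truncation and a backtracking Armijo line search. This is a generic textbook scheme for illustration only, not the paper's family formulas; the test function and all parameter values are assumptions chosen for the example.

```python
import math

def prp_cg(f, grad, x0, max_iter=1000, tol=1e-8):
    """Generic nonlinear CG sketch with the classic PRP beta,
    beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2, truncated at zero
    (the common PRP+ safeguard), and Armijo backtracking. The paper's
    family formulas instead modify beta so that sufficient descent and
    a trust-region bound hold without extra conditions."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                      # initial steepest-descent direction
    for _ in range(max_iter):
        if math.sqrt(dot(g, g)) < tol:
            break
        # Backtracking Armijo line search (inexact)
        alpha, rho, c = 1.0, 0.5, 1e-4
        gd = dot(g, d)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > f(x) + c * alpha * gd:
            alpha *= rho
            if alpha < 1e-12:                  # effective restart on failure
                break
        x_new = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        # PRP beta with nonnegative truncation
        y = [a - b for a, b in zip(g_new, g)]
        beta = max(0.0, dot(g_new, y) / dot(g, g))
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        x, g = x_new, g_new
    return x

# Nonconvex test problem: f(x, y) = x^4 - 2x^2 + y^2, minima at (+-1, 0)
f = lambda x: x[0] ** 4 - 2 * x[0] ** 2 + x[1] ** 2
grad = lambda x: [4 * x[0] ** 3 - 4 * x[0], 2 * x[1]]
x_star = prp_cg(f, grad, [0.5, 0.5])
```

On this nonconvex example the iteration descends to one of the two minimizers; the PRP+ truncation (resetting the direction toward steepest descent when beta would be negative) is one standard device for nonconvex problems, whereas the paper's family approach builds the descent and trust-region guarantees into the formula itself.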

Details

ISSN :
1572-9265 and 1017-1398
Volume :
84
Database :
OpenAIRE
Journal :
Numerical Algorithms
Accession number :
edsair.doi...........66a128b565b65f39268984999cec7f2c