
Improved Fletcher–Reeves and Dai–Yuan conjugate gradient methods with the strong Wolfe line search.

Authors :
Jiang, Xianzhen
Jian, Jinbao
Source :
Journal of Computational & Applied Mathematics. Mar 2019, Vol. 348, p525-534. 10p.
Publication Year :
2019

Abstract

The conjugate gradient methods (CGMs) are very effective iterative methods for solving large-scale unconstrained optimization problems. The aim of this work is to improve the Fletcher–Reeves (FR) and Dai–Yuan (DY) CGMs. First, based on the conjugate parameters of the FR method and the DY method, and using the second inequality of the strong Wolfe line search, two new conjugate parameters are constructed. Second, from these two new conjugate parameters, a further FR-type conjugate parameter is presented. Third, with the steplength computed by the strong Wolfe line search, three improved CGMs are proposed for large-scale unconstrained optimization. Under the usual assumptions, all three improved methods are proved to possess the sufficient descent property and global convergence. Finally, three groups of experiments and their corresponding performance profiles are reported, showing that the proposed methods are very promising. [ABSTRACT FROM AUTHOR]
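For context (background not detailed in this record beyond the abstract): a nonlinear CG iteration takes the form x_{k+1} = x_k + alpha_k * d_k with d_{k+1} = -g_{k+1} + beta_k * d_k, where the classical parameters are beta_k^FR = ||g_{k+1}||^2 / ||g_k||^2 and beta_k^DY = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)). The strong Wolfe line search chooses alpha_k satisfying f(x_k + alpha_k d_k) <= f(x_k) + delta * alpha_k * g_k^T d_k and |g(x_k + alpha_k d_k)^T d_k| <= -sigma * g_k^T d_k with 0 < delta < sigma < 1; the second of these inequalities is the one the abstract refers to. The sketch below is only an illustrative Python implementation of this classical framework, using the standard FR parameter and SciPy's strong Wolfe line search; the paper's improved conjugate parameters are not given in this record and are not reproduced here, and the function and variable names are purely illustrative.

# Minimal sketch of a nonlinear conjugate gradient loop with a strong Wolfe
# line search. The classical FR rule beta = ||g_{k+1}||^2 / ||g_k||^2 is used
# as a stand-in for the paper's improved parameters, which this record does
# not specify.
import numpy as np
from scipy.optimize import line_search  # satisfies the strong Wolfe conditions

def cg_fr(f, grad, x0, tol=1e-6, max_iter=1000, c1=1e-4, c2=0.1):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Steplength from the strong Wolfe line search (c2 < 1/2 is the usual
        # requirement for FR-type global convergence results).
        alpha = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)[0]
        if alpha is None:                 # line search failed; take a small step
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)  # classical Fletcher-Reeves parameter
        d = -g_new + beta * d             # new search direction
        x, g = x_new, g_new
    return x

# Example usage: minimize the Rosenbrock function from SciPy.
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    x_star = cg_fr(rosen, rosen_der, np.array([-1.2, 1.0]))
    print(x_star)  # should approach [1, 1]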

Details

Language :
English
ISSN :
0377-0427
Volume :
348
Database :
Academic Search Index
Journal :
Journal of Computational & Applied Mathematics
Publication Type :
Academic Journal
Accession number :
133149743
Full Text :
https://doi.org/10.1016/j.cam.2018.09.012