
Efficient nonlinear conjugate gradient techniques for vector optimization problems.

Authors :
YAHAYA, JAMILU
KUMAM, POOM
ABUBAKAR, JAMILU
Source :
Carpathian Journal of Mathematics. 2024, Vol. 40 Issue 2, p515-533. 19p.
Publication Year :
2024

Abstract

Conjugate gradient techniques are known for their simplicity and minimal memory usage. However, in the vector optimization context, the Polak-Ribière-Polyak (PRP), Liu-Storey (LS), and Hestenes-Stiefel (HS) conjugate gradient (CG) techniques fail to satisfy the sufficient descent property under Wolfe line searches. In this work, we propose variants of the PRP, LS, and HS CG techniques, termed YPR, YLS, and YHS, respectively. These techniques exhibit the sufficient descent property independently of the line search, except for YHS, which requires the Wolfe line search for its sufficient descent property. Under certain standard assumptions and employing the strong Wolfe conditions, we establish the global convergence of the proposed techniques. The global convergence analysis does not require convexity of the objective functions. Additionally, we present numerical experiments and comparisons to demonstrate the implementation, efficiency, and robustness of the proposed techniques. [ABSTRACT FROM AUTHOR]
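For orientation, the standard scalar-case forms of the three CG parameters named in the abstract are sketched below. These are the classical textbook definitions, not the paper's vector-valued YPR, YLS, and YHS variants, which modify them. With $g_k = \nabla f(x_k)$ and $y_k = g_{k+1} - g_k$, the CG iteration is
\[
  x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,
\]
with
\[
  \beta_k^{\mathrm{PRP}} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^2}, \qquad
  \beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
  \beta_k^{\mathrm{LS}} = -\frac{g_{k+1}^{\top} y_k}{d_k^{\top} g_k},
\]
and the sufficient descent property referred to above reads
\[
  g_k^{\top} d_k \le -c \, \|g_k\|^2 \quad \text{for some constant } c > 0.
\]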

Details

Language :
English
ISSN :
1584-2851
Volume :
40
Issue :
2
Database :
Academic Search Index
Journal :
Carpathian Journal of Mathematics
Publication Type :
Academic Journal
Accession number :
176288755
Full Text :
https://doi.org/10.37193/CJM.2024.02.18