NONLINEAR CONJUGATE GRADIENT METHODS FOR VECTOR OPTIMIZATION.

Authors :
LUCAMBIO PÉREZ, L. R.
PRUDENTE, L. F.
Source :
SIAM Journal on Optimization. 2018, Vol. 28 Issue 3, p2690-2720. 31p.
Publication Year :
2018

Abstract

In this work, we propose nonlinear conjugate gradient methods for finding critical points of vector-valued functions with respect to the partial order induced by a closed, convex, and pointed cone with nonempty interior. No convexity assumption is made on the objectives. The concepts of Wolfe and Zoutendijk conditions are extended to vector-valued optimization. In particular, we show that there exist intervals of step sizes satisfying the Wolfe-type conditions. The convergence analysis covers the vector extensions of the Fletcher-Reeves, conjugate descent, Dai-Yuan, Polak-Ribière-Polyak, and Hestenes-Stiefel parameters, which retrieve the classical ones in the scalar minimization case. Under inexact line searches and without regular restarts, we prove that the sequences generated by the proposed methods converge to points that satisfy the first-order necessary condition for Pareto optimality. Numerical experiments illustrating the practical behavior of the methods are presented. [ABSTRACT FROM AUTHOR]
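The abstract refers to the classical scalar conjugate gradient parameters (Fletcher-Reeves, Dai-Yuan, etc.) that the paper generalizes to the vector-valued setting. As a point of reference, here is a minimal sketch of the classical scalar nonlinear conjugate gradient method with the Fletcher-Reeves parameter. This is NOT the paper's vector-valued algorithm: the line-search parameters (`c`, `rho`) and the Armijo backtracking (a simplification of the Wolfe-type conditions discussed in the paper) are illustrative assumptions.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Scalar nonlinear CG with the Fletcher-Reeves parameter.

    Illustrative sketch only: uses a simple backtracking (Armijo)
    line search in place of the Wolfe-type conditions, and a
    steepest-descent restart as a safeguard.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (hypothetical parameters)
        t, c, rho = 1.0, 1e-4, 0.5
        while f(x + t * d) > f(x) + c * t * g.dot(d):
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        # Fletcher-Reeves parameter: beta = ||g_{k+1}||^2 / ||g_k||^2
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:  # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a simple convex quadratic, the iterates converge to the unique minimizer; the contribution of the paper is establishing analogous convergence to Pareto-critical points for vector-valued objectives without convexity assumptions.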

Details

Language :
English
ISSN :
1052-6234
Volume :
28
Issue :
3
Database :
Academic Search Index
Journal :
SIAM Journal on Optimization
Publication Type :
Academic Journal
Accession number :
132350829
Full Text :
https://doi.org/10.1137/17M1126588