NONLINEAR CONJUGATE GRADIENT METHODS FOR VECTOR OPTIMIZATION
- Author
- LUCAMBIO PÉREZ, L. R. and PRUDENTE, L. F.
- Subjects
- *CONJUGATE gradient methods, *STOCHASTIC convergence, *CONSTRAINED optimization, *PARETO optimum, *SEARCH algorithms
- Abstract
In this work, we propose nonlinear conjugate gradient methods for finding critical points of vector-valued functions with respect to the partial order induced by a closed, convex, and pointed cone with nonempty interior. No convexity assumption is made on the objectives. The concepts of Wolfe and Zoutendijk conditions are extended to vector-valued optimization. In particular, we show that there exist intervals of step sizes satisfying the Wolfe-type conditions. The convergence analysis covers the vector extensions of the Fletcher-Reeves, conjugate descent, Dai-Yuan, Polak-Ribière-Polyak, and Hestenes-Stiefel parameters, which retrieve the classical ones in the scalar minimization case. Under inexact line searches and without regular restarts, we prove that the sequences generated by the proposed methods find points that satisfy the first-order necessary condition for Pareto optimality. Numerical experiments illustrating the practical behavior of the methods are presented. [ABSTRACT FROM AUTHOR]
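To illustrate the classical scalar case that the paper's vector extensions recover, the following is a minimal sketch of nonlinear conjugate gradients with the Fletcher-Reeves parameter and an inexact Wolfe line search. The function names, constants (c1, c2), and the bisection bracketing scheme are illustrative choices, not the authors' method; the descent-direction restart is a common safeguard for Fletcher-Reeves under inexact searches.

```python
import numpy as np

def wolfe_step(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=100):
    """Find a step t along descent direction d satisfying the Wolfe conditions:
       f(x + t d) <= f(x) + c1 t g(x)'d   (sufficient decrease)
       g(x + t d)'d >= c2 g(x)'d          (curvature)
    via simple bisection bracketing (an illustrative scheme)."""
    f0, g0 = f(x), grad(x) @ d
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0:
            hi = t                       # step too long: shrink
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * g0:
            lo = t                       # step too short: expand or bisect
            t = 2.0 * t if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return t
    return t

def fletcher_reeves_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Nonlinear CG with the Fletcher-Reeves parameter (scalar objective)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = wolfe_step(f, grad, x, d)
        x = x + t * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves parameter
        d = -g_new + beta * d
        if g_new @ d >= 0:                # safeguard: restart on non-descent
            d = -g_new
        g = g_new
    return x
```

For example, on the quadratic f(x) = (x1 - 1)^2 + 10 (x2 + 2)^2 the iteration drives the gradient to zero at (1, -2). The vector-valued methods of the paper replace the scalar Wolfe conditions with their cone-ordered analogues while keeping this overall iteration structure.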
- Published
- 2018