
Conditional gradient method for vector optimization.

Authors:
Chen, Wang
Yang, Xinmin
Zhao, Yong
Source:
Computational Optimization & Applications; Jul 2023, Vol. 85, Issue 3, p857-896, 40p
Publication Year:
2023

Abstract

In this paper, we propose a conditional gradient method for solving constrained vector optimization problems with respect to a partial order induced by a closed, convex and pointed cone with nonempty interior. When the partial order under consideration is the one induced by the non-negative orthant, we regain the method for multiobjective optimization recently proposed by Assunção et al. (Comput Optim Appl 78(3):741–768, 2021). In our method, the construction of the auxiliary subproblem is based on the well-known oriented distance function. Three different types of step size strategies (Armijo, adaptive and nonmonotone) are considered. Without any convexity assumption on the objective function, we obtain the stationarity of accumulation points of the sequences produced by the proposed method equipped with the Armijo or the nonmonotone step size rule. To obtain the convergence result of the method with the adaptive step size strategy, we introduce a useful cone convexity condition which allows us to circumvent the intricate question of the Lipschitz continuity of the Jacobian of the objective function. This condition helps us to generalize the classical descent lemma to the vector optimization case. Under a convexity assumption on the objective function, it is proved that all accumulation points of the sequences generated by our method are weakly efficient solutions. Numerical experiments illustrating the practical behavior of the methods are presented. [ABSTRACT FROM AUTHOR]
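For context, the paper extends the classical (scalar) conditional gradient, or Frank-Wolfe, scheme: at each iterate, a linear subproblem over the feasible set yields a direction, and an Armijo backtracking rule picks the step size. The sketch below is not the authors' vector-valued method; it is a minimal illustration of the classical scheme it generalizes, on a hypothetical quadratic objective over the unit simplex (the objective `f`, tolerance, and Armijo parameters are all illustrative choices, not from the paper).

```python
def f(x):
    # illustrative quadratic objective; its minimizer (0.2, 0.8) lies on the simplex
    return (x[0] - 0.2) ** 2 + (x[1] - 0.8) ** 2

def grad(x):
    return [2 * (x[0] - 0.2), 2 * (x[1] - 0.8)]

def lmo(g):
    # linear minimization oracle over the unit simplex:
    # argmin_{s in simplex} <g, s> is the vertex e_i with the smallest gradient entry
    i = min(range(len(g)), key=lambda j: g[j])
    s = [0.0] * len(g)
    s[i] = 1.0
    return s

def conditional_gradient(x, iters=200, beta=0.5, sigma=1e-4):
    """Classical Frank-Wolfe with Armijo backtracking (scalar objective)."""
    for _ in range(iters):
        g = grad(x)
        s = lmo(g)
        d = [s[j] - x[j] for j in range(len(x))]        # feasible descent direction
        gd = sum(g[j] * d[j] for j in range(len(x)))    # Frank-Wolfe gap (negated)
        if gd > -1e-12:
            break  # approximately stationary
        t = 1.0
        # Armijo rule: shrink t until sufficient decrease holds
        while f([x[j] + t * d[j] for j in range(len(x))]) > f(x) + sigma * t * gd:
            t *= beta
        x = [x[j] + t * d[j] for j in range(len(x))]
    return x
```

Since each trial point `x + t*(s - x)` with `t` in (0, 1] is a convex combination of feasible points, every iterate stays in the simplex without any projection, which is the defining appeal of conditional gradient methods.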

Details

Language:
English
ISSN:
0926-6003
Volume:
85
Issue:
3
Database:
Complementary Index
Journal:
Computational Optimization & Applications
Publication Type:
Academic Journal
Accession Number:
164706779
Full Text:
https://doi.org/10.1007/s10589-023-00478-z