A nonmonotone conditional gradient method for multiobjective optimization problems.
- Source :
- Soft Computing - A Fusion of Foundations, Methodologies & Applications. Sep 2024, Vol. 28, Issue 17/18, p9609-9630. 22p.
- Publication Year :
- 2024
Abstract
- This study analyzes the conditional gradient method, also known as the Frank–Wolfe method, for constrained multiobjective optimization problems. We assume that the objectives are continuously differentiable and that the constraint set is convex and compact. We employ an average-type nonmonotone line search, which takes the average of recent objective function values as the reference value. Asymptotic convergence properties are established without convexity assumptions on the objective functions. We prove that every limit point of the sequence of iterates generated by the proposed method is a Pareto critical point. An iteration-complexity bound is provided regardless of any convexity assumption on the objective functions. The effectiveness of the suggested approach is demonstrated by applying it to several benchmark test problems. In addition, the efficiency of the proposed algorithm in generating approximations of the entire Pareto front is compared to the existing Hager–Zhang conjugate gradient method, the steepest descent method, the monotone conditional gradient method, and a nonmonotone conditional gradient method. For the empirical comparison, we use two commonly used performance metrics: the inverted generational distance and the hypervolume indicator. [ABSTRACT FROM AUTHOR]
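To illustrate the kind of step the abstract describes, here is a minimal single-objective sketch of a Frank–Wolfe iteration with an average-type nonmonotone Armijo line search. This is not the paper's multiobjective algorithm; the constraint set (the unit simplex), the quadratic objective, the window length `M`, and the Armijo constant `sigma` are all illustrative assumptions.

```python
import numpy as np

def frank_wolfe_nonmonotone(f, grad, x0, n_iter=200, M=5, sigma=1e-4):
    """Frank-Wolfe over the unit simplex with an average-type
    nonmonotone Armijo line search (single-objective sketch)."""
    x = x0.copy()
    recent = [f(x)]                       # window of recent objective values
    for _ in range(n_iter):
        g = grad(x)
        # Linear minimization oracle over the simplex: the best vertex
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0
        d = s - x                         # feasible descent direction
        slope = g @ d
        if slope > -1e-12:                # stationary: no descent direction
            break
        ref = np.mean(recent)             # average-type reference value
        t = 1.0                           # backtracking from full step
        while f(x + t * d) > ref + sigma * t * slope:
            t *= 0.5
        x = x + t * d
        recent.append(f(x))
        if len(recent) > M:               # keep only the last M values
            recent.pop(0)
    return x

# Hypothetical quadratic whose minimizer lies inside the simplex
c = np.array([0.2, 0.3, 0.5])
f = lambda x: np.sum((x - c) ** 2)
grad = lambda x: 2.0 * (x - c)
x_star = frank_wolfe_nonmonotone(f, grad, np.array([1.0, 0.0, 0.0]))
```

Because the reference value is an average over the last `M` iterates rather than the current function value, individual steps may increase the objective, which is what makes the line search nonmonotone.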
Details
- Language :
- English
- ISSN :
- 1432-7643
- Volume :
- 28
- Issue :
- 17/18
- Database :
- Academic Search Index
- Journal :
- Soft Computing - A Fusion of Foundations, Methodologies & Applications
- Publication Type :
- Academic Journal
- Accession number :
- 180373671
- Full Text :
- https://doi.org/10.1007/s00500-024-09806-9