Stochastic variance-reduced prox-linear algorithms for nonconvex composite optimization.
- Source :
- Mathematical Programming. Sep 2022, Vol. 195, Issue 1/2, p649-691. 43p.
- Publication Year :
- 2022
Abstract
- We consider the problem of minimizing composite functions of the form f(g(x)) + h(x), where f and h are convex functions (which can be nonsmooth) and g is a smooth vector mapping. In addition, we assume that g is the average of a finite number of component mappings or the expectation over a family of random component mappings. We propose a class of stochastic variance-reduced prox-linear algorithms for solving such problems and bound their sample complexities for finding an ε-stationary point in terms of the total number of evaluations of the component mappings and their Jacobians. When g is a finite average of N components, we obtain sample complexity O(N + N^{4/5} ε^{-1}) for both mapping and Jacobian evaluations. When g is a general expectation, we obtain sample complexities of O(ε^{-5/2}) and O(ε^{-3/2}) for component mappings and their Jacobians, respectively. If in addition f is smooth, then improved sample complexities of O(N + N^{1/2} ε^{-1}) and O(ε^{-3/2}) are derived for g being a finite average and a general expectation, respectively, for both component mapping and Jacobian evaluations. [ABSTRACT FROM AUTHOR]
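To illustrate the composite structure f(g(x)) + h(x) and the prox-linear idea the abstract describes, the following is a minimal deterministic (full-batch) sketch. It assumes for concreteness that f(u) = 0.5·||u||², h = 0, and that g is the average of N hypothetical components g_i(x) = tanh(A_i x); these choices are illustrative assumptions, not the paper's setting, and the sketch omits the stochastic variance-reduced estimators of g and its Jacobian that are the paper's contribution. Each step linearizes g at the current point and solves the resulting convex proximal subproblem, which here has a closed form because f is quadratic.

```python
import numpy as np

# Illustrative problem data: g is the average of N components
# g_i(x) = tanh(A_i x), with A_i random (an assumption for this sketch).
rng = np.random.default_rng(0)
N, n, m = 20, 5, 3
A = rng.standard_normal((N, m, n))

def g(x):
    # average of the N component mappings, R^n -> R^m
    return np.mean(np.tanh(A @ x), axis=0)

def jacobian(x):
    # average of the component Jacobians: diag(1 - tanh^2(A_i x)) @ A_i
    s = 1.0 - np.tanh(A @ x) ** 2
    return np.mean(s[:, :, None] * A, axis=0)

def prox_linear_step(x, eta):
    # Linearize g at x and solve the convex subproblem
    #   min_d 0.5*||g(x) + J d||^2 + ||d||^2 / (2*eta),
    # whose closed-form solution is a damped Gauss-Newton step
    # (closed form only because f is quadratic in this sketch).
    gx, J = g(x), jacobian(x)
    d = -np.linalg.solve(J.T @ J + np.eye(n) / eta, J.T @ gx)
    return x + d

x = rng.standard_normal(n)
obj0 = 0.5 * np.linalg.norm(g(x)) ** 2
for _ in range(100):
    x = prox_linear_step(x, eta=1.0)
obj = 0.5 * np.linalg.norm(g(x)) ** 2
print(obj0, obj)  # the objective typically decreases toward a stationary value
```

A stochastic variance-reduced variant, as in the paper, would replace `g(x)` and `jacobian(x)` with mini-batch estimators that are periodically corrected against full (or large-batch) evaluations, which is what drives the improved sample complexities quoted above.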
Details
- Language :
- English
- ISSN :
- 0025-5610
- Volume :
- 195
- Issue :
- 1/2
- Database :
- Academic Search Index
- Journal :
- Mathematical Programming
- Publication Type :
- Academic Journal
- Accession number :
- 159792521
- Full Text :
- https://doi.org/10.1007/s10107-021-01709-z