STOCHASTIC MODEL-BASED MINIMIZATION OF WEAKLY CONVEX FUNCTIONS.

Authors :
DAVIS, DAMEK
DRUSVYATSKIY, DMITRIY
Source :
SIAM Journal on Optimization. 2019, Vol. 29 Issue 1, p207-239. 33p.
Publication Year :
2019

Abstract

We consider a family of algorithms that successively sample and minimize simple stochastic models of the objective function. We show that under reasonable conditions on approximation quality and regularity of the models, any such algorithm drives a natural stationarity measure to zero at the rate O(k^{-1/4}). As a consequence, we obtain the first complexity guarantees for the stochastic proximal point, proximal subgradient, and regularized Gauss-Newton methods for minimizing compositions of convex functions with smooth maps. The guiding principle underlying the complexity guarantees is that all algorithms under consideration can be interpreted as approximate descent methods on an implicit smoothing of the problem, given by the Moreau envelope. Specializing to classical circumstances, we obtain the long-sought convergence rate of the stochastic projected gradient method, without batching, for minimizing a smooth function on a closed convex set.
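For the reader's convenience, the Moreau envelope referenced in the abstract is the standard smoothing; the following is the textbook definition, stated here for context rather than quoted from the paper. For a weakly convex function \(\varphi\) and sufficiently small \(\lambda > 0\),

```latex
\varphi_\lambda(x) \;=\; \min_y \Big\{ \varphi(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
\qquad
\nabla \varphi_\lambda(x) \;=\; \lambda^{-1}\big(x - \operatorname{prox}_{\lambda\varphi}(x)\big),
```

and the natural stationarity measure driven to zero at rate \(O(k^{-1/4})\) is \(\|\nabla \varphi_\lambda(x_k)\|\).

The sketch below illustrates the stochastic projected gradient method mentioned in the abstract's last sentence. It is a minimal example on a hypothetical instance (least squares over a Euclidean ball); the data, constraint radius, step-size schedule, and iteration count are illustrative assumptions, not choices made by the authors.

```python
import numpy as np

# Minimal sketch of the stochastic projected gradient method, without
# batching, on an assumed instance: least squares over a Euclidean ball.
# All problem data below are illustrative, not from the paper.
rng = np.random.default_rng(0)
n, d = 1000, 20
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)
radius = 1.0  # closed convex set: {x : ||x||_2 <= radius}

def project(x):
    """Euclidean projection onto the ball of the given radius."""
    nrm = np.linalg.norm(x)
    return x if nrm <= radius else (radius / nrm) * x

def stoch_grad(x, i):
    """Gradient of the single-sample loss 0.5 * (a_i @ x - b_i)^2."""
    return (A[i] @ x - b[i]) * A[i]

def full_objective(x):
    """Mean squared residual over the whole dataset, for monitoring."""
    r = A @ x - b
    return 0.5 * np.mean(r ** 2)

x = np.zeros(d)
K = 5000
for k in range(K):
    i = rng.integers(n)             # sample one data point (no batching)
    alpha = 1.0 / np.sqrt(k + 1)    # a standard diminishing step size
    x = project(x - alpha * stoch_grad(x, i))  # gradient step, then project

print(f"objective after {K} steps: {full_objective(x):.4f}")
```

Each iteration samples a single data point, takes a stochastic gradient step, and projects back onto the convex set, matching the algorithm template whose O(k^{-1/4}) stationarity rate the paper establishes.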

Details

Language :
English
ISSN :
1052-6234
Volume :
29
Issue :
1
Database :
Academic Search Index
Journal :
SIAM Journal on Optimization
Publication Type :
Academic Journal
Accession number :
136148931
Full Text :
https://doi.org/10.1137/18M1178244