Stochastic algorithms with geometric step decay converge linearly on sharp functions.

Authors:
Davis, Damek
Drusvyatskiy, Dmitriy
Charisopoulos, Vasileios
Source:
Mathematical Programming. Sep 2024, Vol. 207, Issue 1/2, p145-190. 46p.
Publication Year:
2024

Abstract

Stochastic (sub)gradient methods require step size schedule tuning to perform well in practice. Classical tuning strategies decay the step size polynomially and lead to optimal sublinear rates on (strongly) convex problems. An alternative schedule, popular in nonconvex optimization, is called geometric step decay and proceeds by halving the step size after every few epochs. In recent work, geometric step decay was shown to improve exponentially upon classical sublinear rates for the class of sharp convex functions. In this work, we ask whether geometric step decay similarly improves stochastic algorithms for the class of sharp weakly convex problems. Such losses feature in modern statistical recovery problems and lead to a new challenge not present in the convex setting: the region of convergence is local, so one must bound the probability of escape. Our main result shows that for a large class of stochastic, sharp, nonsmooth, and nonconvex problems, a geometric step decay schedule endows well-known algorithms with a local linear (or nearly linear) rate of convergence to global minimizers. This guarantee applies to the stochastic projected subgradient, proximal point, and prox-linear algorithms. As an application of our main result, we analyze two statistical recovery tasks, phase retrieval and blind deconvolution, matching the best known guarantees under Gaussian measurement models and establishing new guarantees under heavy-tailed distributions. [ABSTRACT FROM AUTHOR]
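
The abstract pins down the schedule precisely: run the stochastic method in stages with a constant step size, and halve the step size between stages. The Python sketch below is not from the paper; it is a minimal illustration of geometric step decay for a plain stochastic subgradient method on a toy sharp convex loss f(x) = E_a |<a, x - x_star>|. Every name and hyperparameter in it (geometric_step_decay_subgradient, gamma0, stage_length, halving_factor, the stage counts) is an illustrative choice, not the paper's.

```python
import numpy as np

def geometric_step_decay_subgradient(subgrad, x0, gamma0=1.0,
                                     halving_factor=0.5,
                                     stage_length=1000, num_stages=15):
    """Stochastic subgradient method with geometric step decay.

    Runs num_stages stages of stage_length iterations each. The step size
    is held constant within a stage and multiplied by halving_factor
    (0.5 = halving) at the end of every stage.
    """
    x = np.array(x0, dtype=float)
    gamma = gamma0
    for _ in range(num_stages):
        for _ in range(stage_length):
            g = subgrad(x)        # one stochastic subgradient at x
            x = x - gamma * g     # (a projection step would go here)
        gamma *= halving_factor   # geometric decay between stages
    return x

# Toy sharp convex objective: f(x) = E_a |<a, x - x_star>| with Gaussian a.
# Its value equals sqrt(2/pi) * ||x - x_star||, so it is sharp around x_star.
rng = np.random.default_rng(0)
d = 50
x_star = rng.standard_normal(d)

def stochastic_subgrad(x):
    a = rng.standard_normal(d)              # one random measurement
    return np.sign(a @ x - a @ x_star) * a  # subgradient of |<a, x - x_star>|

x_hat = geometric_step_decay_subgradient(stochastic_subgrad, x0=np.zeros(d))
print("distance to solution:", np.linalg.norm(x_hat - x_star))
```

Roughly speaking, on a sharp loss like this the prior work cited in the abstract shows that each stage contracts the distance to the solution by a constant factor once the stage length passes a problem-dependent threshold, so the overall error decays geometrically in the number of stages; the paper's weakly convex results localize this behavior and additionally bound the probability of escaping the region of convergence.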

Details

Language:
English
ISSN:
0025-5610
Volume:
207
Issue:
1/2
Database:
Academic Search Index
Journal:
Mathematical Programming
Publication Type:
Academic Journal
Accession number:
178877812
Full Text:
https://doi.org/10.1007/s10107-023-02003-w