CONVERGENCE RATES FOR DETERMINISTIC AND STOCHASTIC SUBGRADIENT METHODS WITHOUT LIPSCHITZ CONTINUITY.
- Source :
- SIAM Journal on Optimization. 2019, Vol. 29, Issue 2, p1350-1365. 16p.
- Publication Year :
- 2019
Abstract
- We extend the classic convergence rate theory for subgradient methods to apply to non-Lipschitz functions. For the deterministic projected subgradient method, we present a global O(1/√T) convergence rate for any convex function that is locally Lipschitz around its minimizers. This approach is based on Shor's classic subgradient analysis and implies generalizations of the standard convergence rates for gradient descent on functions with Lipschitz or Hölder continuous gradients. Further, we show an O(1/√T) convergence rate for the stochastic projected subgradient method on convex functions with at most quadratic growth, which improves to O(1/T) under either strong convexity or a weaker quadratic lower bound condition. [ABSTRACT FROM AUTHOR]
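The projected subgradient method analyzed in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm or step-size schedule: the function names (`projected_subgradient`, `box`) and the constant step size c/√T with iterate averaging are assumptions chosen to match the classic O(1/√T) setting. The example objective f(x) = ‖x‖² is not globally Lipschitz but is locally Lipschitz around its minimizer, the regime the paper targets.

```python
import numpy as np

def projected_subgradient(subgrad, project, x0, T, c=1.0):
    """Projected subgradient method with fixed step size c/sqrt(T).

    Returns the average of the projected iterates, the point for
    which O(1/sqrt(T)) rates are classically stated.
    """
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)
    step = c / np.sqrt(T)
    for _ in range(T):
        # Subgradient step followed by projection back onto the feasible set.
        x = project(x - step * subgrad(x))
        avg += x / T
    return avg

# Example: minimize f(x) = ||x||^2 over the box [-10, 10]^2.
# f is convex and locally Lipschitz at x* = 0, but not globally Lipschitz.
f_grad = lambda x: 2.0 * x                      # (sub)gradient of f
box = lambda x: np.clip(x, -10.0, 10.0)         # projection onto the box
x_hat = projected_subgradient(f_grad, box, x0=[8.0, -5.0], T=10_000)
```

With T = 10,000 iterations the averaged iterate `x_hat` lands close to the minimizer at the origin, consistent with the O(1/√T) suboptimality guarantee.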
Details
- Language :
- English
- ISSN :
- 1052-6234
- Volume :
- 29
- Issue :
- 2
- Database :
- Academic Search Index
- Journal :
- SIAM Journal on Optimization
- Publication Type :
- Academic Journal
- Accession number :
- 139017637
- Full Text :
- https://doi.org/10.1137/18M117306X