
Some Primal-Dual Theory for Subgradient Methods for Strongly Convex Optimization

Authors:
Grimmer, Benjamin
Li, Danlin
Publication Year:
2023

Abstract

We consider (stochastic) subgradient methods for strongly convex but potentially nonsmooth, non-Lipschitz optimization. We provide new equivalent dual descriptions (in the style of dual averaging) for the classic subgradient method, the proximal subgradient method, and the switching subgradient method. These equivalences enable $O(1/T)$ convergence guarantees in terms of both their classic primal gap and a previously unanalyzed dual gap for strongly convex optimization. Consequently, our theory equips these classic methods with simple, optimal stopping criteria and optimality certificates at no added computational cost. Our results apply under nearly any stepsize selection and for a range of non-Lipschitz, ill-conditioned problems where the early iterations of the subgradient method may diverge exponentially quickly (a phenomenon which, to the best of our knowledge, no prior works address). Even in the presence of such undesirable behaviors, our theory still ensures and bounds eventual convergence.

29 pages
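As a concrete illustration only (a minimal sketch, not the method or analysis from the paper), the snippet below runs the classic subgradient method on a strongly convex, nonsmooth toy problem f(x) = (mu/2)*||x||^2 + ||x - b||_1, using the standard stepsizes 2/(mu*(t+1)) and t-weighted iterate averaging, one well-known recipe for the O(1/T) primal rate the abstract refers to. The function f, the parameter mu, and all names here are illustrative assumptions, not taken from the paper.

import numpy as np

# Minimal sketch (illustrative, not the paper's exact method):
# classic subgradient method on f(x) = (mu/2)*||x||^2 + ||x - b||_1,
# strongly convex but nonsmooth, with the standard stepsizes
# eta_t = 2/(mu*(t+1)) and t-weighted averaging of the iterates.
def subgradient_method(b, mu=1.0, T=1000):
    x = np.zeros_like(b)
    avg = np.zeros_like(b)          # running t-weighted average of iterates
    weight_sum = 0.0
    for t in range(1, T + 1):
        g = mu * x + np.sign(x - b)           # a subgradient of f at x
        x = x - (2.0 / (mu * (t + 1))) * g    # classic strongly convex stepsize
        weight_sum += t
        avg += (t / weight_sum) * (x - avg)   # avg = sum_t t*x_t / sum_t t
    return avg

b = np.array([3.0, -2.0, 0.5])
print(subgradient_method(b))        # approaches the minimizer (1, -1, 0.5)

For this choice of f with mu = 1, the coordinate-wise minimizer works out to (1, -1, 0.5), so the printed weighted average should land close to that point; the paper's contribution is a matching dual sequence and dual gap that certify such convergence at no added computational cost.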

Details

Language:
English
Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....d180871bc223ff79a9865b945101d693