
Proximally Guided Stochastic Subgradient Method for Nonsmooth, Nonconvex Problems

Authors :
Davis, Damek
Grimmer, Benjamin
Source :
SIAM Journal on Optimization. 2019, Vol. 29 Issue 3, p1908-1930. 23p.
Publication Year :
2019

Abstract

In this paper, we introduce a stochastic projected subgradient method for weakly convex (i.e., uniformly prox-regular) nonsmooth, nonconvex functions, a wide class of functions that includes the additive and convex composite classes. At a high level, the method is an inexact proximal-point iteration in which the strongly convex proximal subproblems are quickly solved with a specialized stochastic projected subgradient method. The primary contribution of this paper is a simple proof that the proposed algorithm converges at the same rate as the stochastic gradient method for smooth nonconvex problems. This result appears to be the first convergence rate analysis of a stochastic (or even deterministic) subgradient method for the class of weakly convex functions. In addition, a two-phase variant is proposed that significantly reduces the variance of the solutions returned by the algorithm. Finally, preliminary numerical experiments are also provided.
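To illustrate the structure described in the abstract, the following is a minimal Python sketch of an inexact proximal-point loop whose strongly convex subproblems are handled by a projected stochastic subgradient method with iterate averaging. The oracle `stoch_subgrad`, the projection `project`, the step-size schedule, and the iteration counts are illustrative assumptions, not the paper's exact algorithmic parameters or analysis.

```python
import numpy as np

def proximally_guided_subgradient(stoch_subgrad, project, x0,
                                  rho=1.0, outer_iters=100, inner_iters=50):
    """Sketch: each outer step approximately solves the strongly convex
    proximal subproblem min_y f(y) + (rho/2)||y - x_t||^2 by projected
    stochastic subgradient descent and returns an averaged inner iterate.
    rho should exceed the weak-convexity constant of f (assumed known here)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        center = x.copy()
        y = x.copy()
        y_avg = np.zeros_like(y)
        for k in range(inner_iters):
            # stochastic subgradient of the regularized subproblem
            g = stoch_subgrad(y) + rho * (y - center)
            step = 2.0 / (rho * (k + 2))   # illustrative strongly convex step size
            y = project(y - step * g)
            # weighted running average of the inner iterates (weights sum to 1)
            y_avg += 2.0 * (k + 1) / (inner_iters * (inner_iters + 1)) * y
        x = y_avg                          # averaged iterate becomes the next proximal center
    return x

# Toy usage on a convex (hence trivially weakly convex) instance:
# minimize the average of |<a_i, x> - b_i| over the unit ball.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(1000, 10)), rng.normal(size=1000)

def stoch_subgrad(x, batch=8):
    idx = rng.integers(0, len(b), size=batch)
    r = A[idx] @ x - b[idx]
    return (np.sign(r) @ A[idx]) / batch

def project(x):
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

x_hat = proximally_guided_subgradient(stoch_subgrad, project, x0=np.zeros(10))
```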

Details

Language :
English
ISSN :
1052-6234
Volume :
29
Issue :
3
Database :
Academic Search Index
Journal :
SIAM Journal on Optimization
Publication Type :
Academic Journal
Accession Number :
144836763
Full Text :
https://doi.org/10.1137/17M1151031