
On the Convergence of Mirror Descent Beyond Stochastic Convex Programming

Authors :
Zhou, Zhengyuan
Mertikopoulos, Panayotis
Bambos, Nicholas
Boyd, Stephen P.
Glynn, Peter W.
Source :
SIAM Journal on Optimization. 2020, Vol. 30 Issue 1, p687-716. 30p.
Publication Year :
2020

Abstract

In this paper, we examine the convergence of mirror descent in a class of stochastic optimization problems that are not necessarily convex (or even quasi-convex) and which we call variationally coherent. Since the standard technique of "ergodic averaging" offers no tangible benefits beyond convex programming, we focus directly on the algorithm's last generated sample (its "last iterate"), and we show that it converges with probability 1 if the underlying problem is coherent. We further consider a localized version of variational coherence which ensures local convergence of stochastic mirror descent (SMD) with high probability. These results contribute to the landscape of nonconvex stochastic optimization by showing that (quasi-)convexity is not essential for convergence to a global minimum: rather, variational coherence, a much weaker requirement, suffices. Finally, building on the above, we reveal an interesting insight regarding the convergence speed of SMD: in problems with sharp minima (such as generic linear programs or concave minimization problems), SMD reaches a minimum point in a finite number of steps (a.s.), even in the presence of persistent gradient noise. This result is to be contrasted with existing black-box convergence rate estimates that are only asymptotic. [ABSTRACT FROM AUTHOR]
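To make the algorithm discussed in the abstract concrete, the following is a minimal sketch (not taken from the paper) of stochastic mirror descent with an entropic mirror map on the probability simplex, i.e., the exponentiated-gradient update. The objective, noise model, step size, and function names are illustrative assumptions; the paper's setting is considerably more general.

```python
import numpy as np

def smd_entropic(grad_oracle, x0, eta=0.1, n_steps=1000, noise=0.1, seed=0):
    """Sketch of stochastic mirror descent on the probability simplex
    with an entropic mirror map (exponentiated-gradient update).

    grad_oracle(x) returns the exact gradient; zero-mean Gaussian noise
    is added here only to simulate persistent stochastic feedback.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        g = grad_oracle(x) + noise * rng.standard_normal(x.shape)
        # Entropic mirror step: multiplicative update, then renormalize.
        x = x * np.exp(-eta * g)
        x = x / x.sum()
    return x  # the "last iterate" studied in the paper

# Illustrative objective: f(x) = <c, x> over the simplex, a linear program
# whose minimum is the sharp vertex with the smallest cost coefficient.
c = np.array([3.0, 1.0, 2.0])
x_last = smd_entropic(lambda x: c, np.ones(3) / 3)
print(x_last)  # concentrates near the vertex (0, 1, 0)
```

The linear objective is chosen because it has a sharp minimum, the regime in which the abstract states SMD reaches a minimizer in finitely many steps almost surely despite gradient noise.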

Details

Language :
English
ISSN :
1052-6234
Volume :
30
Issue :
1
Database :
Academic Search Index
Journal :
SIAM Journal on Optimization
Publication Type :
Academic Journal
Accession number :
148483336
Full Text :
https://doi.org/10.1137/17M1134925