
Exponential Concentration in Stochastic Approximation

Authors :
Law, Kody
Walton, Neil
Yang, Shangda
Publication Year :
2022

Abstract

We analyze the behavior of stochastic approximation algorithms whose iterates, in expectation, make progress towards an objective at each step. When this progress is proportional to the step size of the algorithm, we prove exponential concentration bounds. These tail bounds contrast with the asymptotic normality results more commonly associated with stochastic approximation. The methods we develop rely on a geometric ergodicity proof, which extends a result on Markov chains due to Hajek (1982) to the setting of stochastic approximation algorithms. We apply our results to several stochastic approximation algorithms, specifically Projected Stochastic Gradient Descent, Kiefer-Wolfowitz, and Stochastic Frank-Wolfe. When applicable, our results prove faster $O(1/t)$ and linear convergence rates for Projected Stochastic Gradient Descent with a non-vanishing gradient.

Comment: 35 pages, 11 figures
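To make the setting concrete, the following is a minimal illustrative sketch (not the paper's code) of Projected Stochastic Gradient Descent in the regime the abstract highlights: the unconstrained minimizer lies outside the feasible set, so the gradient does not vanish at the constrained optimum. The objective, constraint set, noise level, and step-size schedule below are all assumptions chosen for illustration.

```python
import numpy as np

# Illustrative sketch: Projected SGD minimizing f(x) = ||x - c||^2 / 2
# over the unit Euclidean ball, with c outside the ball so that the
# gradient is non-vanishing at the constrained optimum c / ||c||.
rng = np.random.default_rng(0)
c = np.array([2.0, 0.0])   # unconstrained minimizer, outside the ball
x = np.zeros(2)            # initial iterate

def project_unit_ball(v):
    """Euclidean projection onto the closed unit ball."""
    n = np.linalg.norm(v)
    return v if n <= 1.0 else v / n

T = 2000
for t in range(1, T + 1):
    grad = (x - c) + 0.1 * rng.standard_normal(2)  # noisy gradient oracle
    step = 1.0 / t                                 # decaying step size
    x = project_unit_ball(x - step * grad)

# The constrained optimum is c / ||c|| = (1, 0); x should end up close to it.
print(x)
```

Here each update makes expected progress toward the optimum, and the projection keeps iterates feasible; the paper's concentration bounds concern how tightly such iterates stay near the optimum, beyond what asymptotic normality alone would give.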

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2208.07243
Document Type :
Working Paper