
Convergence and concentration properties of constant step-size SGD through Markov chains

Authors:
Merad, Ibrahim
Gaïffas, Stéphane
Publication Year:
2023

Abstract

We consider the optimization of a smooth and strongly convex objective using constant step-size stochastic gradient descent (SGD) and study its properties through the prism of Markov chains. We show that, for unbiased gradient estimates with mildly controlled variance, the distribution of the iterates converges to an invariant distribution in total variation distance. We also establish this convergence in Wasserstein-2 distance under more general assumptions than previous work. Thanks to the invariance property of the limit distribution, our analysis shows that it inherits sub-Gaussian or sub-exponential concentration properties whenever these hold for the gradient estimates. This allows the derivation of high-confidence bounds on the final estimate. Finally, under such concentration assumptions in the linear case, we obtain a dimension-free deviation bound for the Polyak-Ruppert average of a tail sequence of iterates. All our results are non-asymptotic, and their consequences are discussed through a few applications.
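To make the setting concrete, the sketch below is a minimal illustration (not from the paper): with a constant step size gamma, the update theta_{k+1} = theta_k - gamma * g(theta_k, xi_{k+1}), where g is an unbiased gradient estimate, defines a time-homogeneous Markov chain, which is the object the abstract describes. The quadratic objective, the step size, and the burn-in fraction used here are arbitrary demonstration choices, not values taken from the paper.

```python
# Illustrative sketch (not the paper's method): constant step-size SGD on a
# strongly convex least-squares objective, with Polyak-Ruppert averaging of
# a tail sequence of iterates. Problem size, gamma, and burn-in are assumed.
import numpy as np

rng = np.random.default_rng(0)
d, n_iters, gamma = 5, 10_000, 0.05    # dimension, iterations, constant step size
A = np.eye(d) + 0.1 * rng.standard_normal((d, d))
H = A.T @ A + np.eye(d)                # positive definite -> strongly convex objective
theta_star = rng.standard_normal(d)    # minimizer of 0.5 (theta - theta*)^T H (theta - theta*)

theta = np.zeros(d)
burn_in = n_iters // 2                 # discard the first half; keep the tail sequence
tail = []
for k in range(n_iters):
    noise = rng.standard_normal(d)     # zero-mean noise -> unbiased gradient estimate
    grad = H @ (theta - theta_star) + noise
    theta = theta - gamma * grad       # constant step: iterates form a Markov chain
    if k >= burn_in:
        tail.append(theta)

theta_pr = np.mean(tail, axis=0)       # Polyak-Ruppert average of the tail sequence
print("last iterate error:", np.linalg.norm(theta - theta_star))
print("tail-average error:", np.linalg.norm(theta_pr - theta_star))
```

With a constant step size the last iterate keeps fluctuating around the minimizer (it samples from a distribution near the invariant one), while the tail average smooths out this stationary noise, which is the intuition behind the deviation bound for the Polyak-Ruppert average mentioned above.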

Details

Language:
English
Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....a42e8d3a70957c7e64f86749eb734679