Almost sure convergence rates of stochastic proximal gradient descent algorithm.
- Source :
- Optimization. Aug 2024, Vol. 73 Issue 8, p2413-2446. 34p.
- Publication Year :
- 2024
Abstract
- Stochastic proximal gradient descent (Prox-SGD) is a standard optimization algorithm for solving stochastic composite optimization problems in machine learning. However, the existing convergence rate analysis of Prox-SGD focuses mostly on convergence in expectation. To address this, we provide an almost sure convergence rate analysis of Prox-SGD in two optimization settings, convexity and strong convexity, showing that any implementation of the stochastic algorithm will converge with probability one. Specifically, we show that the average-iterate almost sure convergence rate of the function values for Prox-SGD is arbitrarily close to the optimal rate $o(1/\sqrt{T})$ in the convex setting. For strongly convex objective functions, we obtain the almost sure convergence rate of the iterative sequence, which is also arbitrarily close to the optimal rate $o(1/T)$. Last but not least, we establish the more challenging last-iterate convergence rate of the function values on convex problems, which contrasts with existing results on convergence for a weighted average. [ABSTRACT FROM AUTHOR]
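- To make the algorithm discussed in the abstract concrete, the following is a minimal sketch of Prox-SGD on a composite objective. The specific problem (lasso: least-squares loss plus an $\ell_1$ penalty), the $1/\sqrt{t}$ step size, and all function names are illustrative assumptions for the convex setting, not the paper's exact construction; the average iterate returned is the quantity the average-iterate rate in the abstract refers to.

```python
import numpy as np


def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)


def prox_sgd_lasso(A, b, lam, T=10_000, seed=0):
    """Illustrative Prox-SGD on F(x) = (1/n) * sum_i 0.5*(a_i^T x - b_i)^2 + lam*||x||_1.

    One data point is sampled per iteration, and a diminishing step size
    eta_t ~ 1/sqrt(t+1) (typical for the convex setting) is used.
    Returns both the running average of the iterates and the last iterate.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    x_avg = np.zeros(d)
    for t in range(T):
        i = rng.integers(n)                           # sample one data point
        g = (A[i] @ x - b[i]) * A[i]                  # stochastic gradient of the smooth part
        eta = 1.0 / np.sqrt(t + 1)                    # diminishing step size
        x = soft_threshold(x - eta * g, eta * lam)    # proximal step on the l1 term
        x_avg += (x - x_avg) / (t + 1)                # running average of the iterates
    return x_avg, x


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.normal(size=(200, 20))
    x_true = np.zeros(20)
    x_true[:3] = [2.0, -1.0, 0.5]
    b = A @ x_true + 0.01 * rng.normal(size=200)
    x_avg, x_last = prox_sgd_lasso(A, b, lam=0.1)
    print("average-iterate error:", np.linalg.norm(x_avg - x_true))
    print("last-iterate error:   ", np.linalg.norm(x_last - x_true))
```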
Details
- Language :
- English
- ISSN :
- 0233-1934
- Volume :
- 73
- Issue :
- 8
- Database :
- Academic Search Index
- Journal :
- Optimization
- Publication Type :
- Academic Journal
- Accession number :
- 178808335
- Full Text :
- https://doi.org/10.1080/02331934.2023.2230976