
Asymptotic error analysis for stochastic gradient optimization schemes with first and second order modified equations

Authors:
Bréhier, Charles-Edouard
Dambrine, Marc
En-Nebbazi, Nassim
Publication Year:
2024

Abstract

We consider a class of stochastic gradient optimization schemes. Assuming that the objective function is strongly convex, we prove weak error estimates, uniform in time, for the error between the solution of the numerical scheme and the solutions of continuous-time modified (or high-resolution) differential equations at first and second order with respect to the time-step size. At first order, the modified equation is deterministic, whereas at second order it is stochastic and depends on a modified objective function. We go beyond existing results, in which the error estimates hold only on finite time intervals and are not uniform in time. This allows us to provide a rigorous complexity analysis of the method in the large-time and small-time-step regimes.
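To illustrate the setting of the abstract, the following is a minimal sketch, not taken from the paper: a stochastic gradient scheme applied to a simple strongly convex quadratic, compared against the deterministic gradient flow that plays the role of the first-order modified equation. The objective, noise model, and all parameter values (`mu`, `h`, `sigma`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative strongly convex objective f(x) = 0.5 * mu * x^2 (mu > 0),
# so grad f(x) = mu * x. This is an assumed example, not the paper's setting.
mu = 2.0      # strong convexity constant
h = 0.01      # time-step size
n_steps = 500
sigma = 0.1   # scale of the additive gradient noise (assumed Gaussian)

# Stochastic gradient scheme: x_{k+1} = x_k - h * (grad f(x_k) + noise_k)
x = 1.0
sgd_path = [x]
for _ in range(n_steps):
    noisy_grad = mu * x + sigma * rng.standard_normal()
    x = x - h * noisy_grad
    sgd_path.append(x)

# First-order modified equation: the deterministic gradient flow
# dX/dt = -grad f(X), which for this quadratic is solved exactly by
# X(t) = X(0) * exp(-mu * t).
t = h * np.arange(n_steps + 1)
flow = 1.0 * np.exp(-mu * t)

# Gap between the scheme and the gradient flow at the final time; averaged
# over independent runs, the weak error is of order h.
print(abs(sgd_path[-1] - flow[-1]))
```

In this toy case the scheme tracks the gradient flow closely for small `h`; the paper's contribution is to make such closeness quantitative via weak error estimates that hold uniformly in time, including a second-order stochastic modified equation not sketched here.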

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2411.05538
Document Type:
Working Paper