
Stochastic Gradient Langevin Dynamics with Variance Reduction

Authors:
Huang, Zhishen
Becker, Stephen
Source:
IJCNN 2021 (International Joint Conference on Neural Networks)
Publication Year:
2021

Abstract

Stochastic gradient Langevin dynamics (SGLD) has gained the attention of optimization researchers due to its global optimization properties. This paper proves an improved convergence property to local minimizers of nonconvex objective functions for SGLD accelerated by variance reduction. Moreover, we prove an ergodicity property of the SGLD scheme, which gives insight into its potential to find global minimizers of nonconvex objectives.
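
For concreteness, below is a minimal runnable sketch of one common way to combine SGLD with an SVRG-style variance-reduced gradient estimator, written in Python/NumPy. The quadratic finite-sum objective, step size, inverse temperature beta, and epoch/inner-loop lengths are illustrative assumptions for this sketch only, not the exact scheme or constants analyzed in the paper.

import numpy as np

# Hypothetical finite-sum objective F(x) = (1/n) sum_i f_i(x) with
# f_i(x) = 0.5 * (a_i^T x - b_i)^2, used only to make the sketch runnable;
# the paper's setting is a general nonconvex objective.
rng = np.random.default_rng(0)
n, d = 100, 5
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def grad_fi(x, i):
    # Gradient of the i-th component f_i.
    return (A[i] @ x - b[i]) * A[i]

def full_grad(x):
    # Full-batch gradient (1/n) sum_i grad f_i(x).
    return (A @ x - b) @ A / n

def svrg_langevin(x0, step=1e-3, beta=10.0, epochs=20, inner=100):
    # SVRG-style variance-reduced SGLD (an illustrative sketch, not the
    # paper's exact algorithm). Each epoch fixes a snapshot point and its
    # full gradient; inner iterations use the variance-reduced gradient
    # estimate plus Gaussian noise at inverse temperature beta.
    x = x0.copy()
    for _ in range(epochs):
        snapshot = x.copy()
        mu = full_grad(snapshot)          # full gradient at the snapshot
        for _ in range(inner):
            i = rng.integers(n)
            # Variance-reduced estimate; unbiased for the full gradient since
            # E_i[grad f_i(x) - grad f_i(snapshot)] = grad F(x) - grad F(snapshot).
            g = grad_fi(x, i) - grad_fi(snapshot, i) + mu
            noise = rng.normal(size=d) * np.sqrt(2.0 * step / beta)
            x = x - step * g + noise      # Langevin step: drift plus injected noise
    return x

x_final = svrg_langevin(np.zeros(d))
print("final iterate:", x_final)

The injected Gaussian noise is what distinguishes this from plain SVRG: it lets the iterates escape poor local regions, which is the mechanism behind the ergodicity and global-optimization claims in the abstract.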

Details

Database:
arXiv
Journal:
IJCNN 2021 (International Joint Conference on Neural Networks)
Publication Type:
Report
Accession number:
edsarx.2102.06759
Document Type:
Working Paper
Full Text:
https://doi.org/10.1109/IJCNN52387.2021.9533646