
Accelerating Perturbed Stochastic Iterates in Asynchronous Lock-Free Optimization

Authors:
Zhou, Kaiwen
So, Anthony Man-Cho
Cheng, James
Publication Year:
2021

Abstract

We show that stochastic acceleration can be achieved under the perturbed iterate framework (Mania et al., 2017) in asynchronous lock-free optimization, which leads to the optimal incremental gradient complexity for finite-sum objectives. We prove that our new accelerated method requires the same linear speed-up condition as the existing non-accelerated methods. Our core algorithmic discovery is a new accelerated SVRG variant with sparse updates. Empirical results are presented to verify our theoretical findings.

Comment: 21 pages, 22 figures
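The record does not spell out the paper's accelerated, sparse-update, lock-free variant, but for orientation, below is a minimal sketch of plain (non-accelerated, sequential) SVRG, the baseline the paper builds on, applied to an assumed least-squares finite sum. The objective, step size, epoch counts, and all names (`svrg`, `A`, `b`) are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def svrg(A, b, x0, step=0.01, epochs=20, rng=None):
    """Plain SVRG sketch for the finite-sum objective
    f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2  (assumed toy problem)."""
    rng = np.random.default_rng(rng)
    n, _ = A.shape
    x = x0.copy()
    for _ in range(epochs):
        # Snapshot point and its full gradient, computed once per epoch.
        x_snap = x.copy()
        full_grad = A.T @ (A @ x_snap - b) / n
        for _ in range(n):
            i = rng.integers(n)
            a_i = A[i]
            # Variance-reduced estimator: grad_i(x) - grad_i(x_snap) + full_grad.
            g = (a_i @ x - b[i]) * a_i - (a_i @ x_snap - b[i]) * a_i + full_grad
            x -= step * g
    return x

# Toy usage: recover a planted solution on synthetic data.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 10))
x_true = rng.standard_normal(10)
b = A @ x_true
x_hat = svrg(A, b, np.zeros(10))
print(np.linalg.norm(x_hat - x_true))
```

The paper's contribution is to combine an acceleration scheme with this kind of variance-reduced update while keeping the per-iteration writes sparse, so that asynchronous lock-free execution (in the sense of the perturbed iterate framework) still attains a linear speed-up.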

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2109.15292
Document Type:
Working Paper