
AI-SARAH: Adaptive and Implicit Stochastic Recursive Gradient Methods

Authors :
Shi, Zheng
Sadiev, Abdurakhmon
Loizou, Nicolas
Richtárik, Peter
Takáč, Martin
Publication Year :
2021

Abstract

We present AI-SARAH, a practical variant of SARAH. Like SARAH, the algorithm employs the stochastic recursive gradient, but it adjusts its step size based on local geometry. AI-SARAH implicitly computes the step size and efficiently estimates the local Lipschitz smoothness of the stochastic functions. It is fully adaptive, tune-free, straightforward to implement, and computationally efficient. We provide technical insight and intuitive illustrations of its design and convergence. We conduct an extensive empirical analysis and demonstrate its strong performance compared with its classical counterparts and other state-of-the-art first-order methods in solving convex machine learning problems.
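To make the abstract concrete, the sketch below illustrates the SARAH-style recursive gradient update on a toy least-squares problem, paired with a simple finite-difference estimate of local Lipschitz smoothness to pick the step size. This is a hedged illustration only: the paper's implicit step-size computation is not reproduced here, and the proxy `L_hat = ||g_i(w) - g_i(w_prev)|| / ||w - w_prev||`, the step cap, and the toy problem are all assumptions made for this example.

```python
import numpy as np

# Toy least-squares problem: f(w) = (1/n) * sum_i 0.5 * (a_i^T w - b_i)^2
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.normal(size=(n, d))
x_star = rng.normal(size=d)
b = A @ x_star

def grad_i(w, i):
    # Gradient of the i-th component function f_i.
    return (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    return A.T @ (A @ w - b) / n

w = np.zeros(d)
eta_cap = 0.05  # conservative cap on the step size (an assumption)
for epoch in range(50):
    v = full_grad(w)                 # full gradient at outer-loop start
    w_prev, v_prev = w.copy(), v
    w = w - eta_cap * v
    for _ in range(n):
        i = rng.integers(n)
        g_new, g_old = grad_i(w, i), grad_i(w_prev, i)
        # SARAH recursion: v_t = grad_i(w_t) - grad_i(w_{t-1}) + v_{t-1}
        v_new = g_new - g_old + v_prev
        # Crude local Lipschitz estimate -> adaptive step size
        # (a stand-in for AI-SARAH's implicit computation).
        dw = np.linalg.norm(w - w_prev)
        L_hat = np.linalg.norm(g_new - g_old) / dw if dw > 0 else 1.0
        eta = min(1.0 / max(L_hat, 1e-8), eta_cap)
        w_prev, v_prev = w.copy(), v_new
        w = w - eta * v_new

final_loss = 0.5 * np.linalg.norm(A @ w - b) ** 2 / n
```

The outer loop restarts the estimator `v` from a full gradient each epoch, which is what makes the inner recursion variance-reduced; the adaptive `eta` merely replaces the hand-tuned constant step that classical SARAH requires.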

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1269530503
Document Type :
Electronic Resource