
STOCHASTIC MULTILEVEL COMPOSITION OPTIMIZATION ALGORITHMS WITH LEVEL-INDEPENDENT CONVERGENCE RATES

Authors :
BALASUBRAMANIAN, KRISHNAKUMAR
GHADIMI, SAEED
NGUYEN, ANTHONY
Source :
SIAM Journal on Optimization. 2022, Vol. 32 Issue 2, p519-544. 26p.
Publication Year :
2022

Abstract

In this paper, we study smooth stochastic multilevel composition optimization problems, where the objective function is a nested composition of T functions. We assume access to noisy evaluations of the functions and their gradients through a stochastic first-order oracle. For solving this class of problems, we propose two algorithms using moving-average stochastic estimates and analyze their convergence to an ε-stationary point of the problem. We show that the first algorithm, which is a generalization of [S. Ghadimi, A. Ruszczyński, and M. Wang, SIAM J. Optim., 30 (2020), pp. 960-979] to the T-level case, can achieve a sample complexity of O_T(1/ε^6) by using minibatches of samples in each iteration, where O_T hides constants that depend on T. By modifying this algorithm using linearized stochastic estimates of the function values, we improve the sample complexity to O_T(1/ε^4). This modification not only removes the requirement of having a minibatch of samples in each iteration, but also makes the algorithm parameter-free and easy to implement. To the best of our knowledge, this is the first time that such an online algorithm designed for the (un)constrained multilevel setting obtains the same sample complexity as the smooth single-level setting, under standard assumptions (unbiasedness and boundedness of the second moments) on the stochastic first-order oracle. [ABSTRACT FROM AUTHOR]
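For context, the problem studied has the nested form min_x F(x) = f_1(f_2(... f_T(x) ...)), and in the unconstrained case an ε-stationary point is typically one with E||∇F(x)||^2 ≤ ε. The sketch below is a rough illustration of the moving-average idea the abstract describes, not the paper's algorithm: for the simplest multilevel case T = 2, an auxiliary sequence tracks the inner function value from noisy oracle calls so that a chain-rule gradient estimate can be formed. The quadratic test instance, oracle names, and step-size choices are all assumptions made for the sake of the example.

import numpy as np

# Hypothetical two-level instance (T = 2): F(x) = f1(f2(x)) with
# f2(x) = A x and f1(u) = 0.5 * ||u||^2, so grad F(x) = A^T A x.
# The oracles add Gaussian noise to mimic a stochastic first-order
# oracle; this setup is illustrative, not the paper's notation.
rng = np.random.default_rng(0)
d, m = 5, 3
A = rng.standard_normal((m, d))

def oracle_f2(x, sigma=0.01):
    # noisy value and Jacobian of the inner function f2
    return (A @ x + sigma * rng.standard_normal(m),
            A + sigma * rng.standard_normal((m, d)))

def oracle_grad_f1(u, sigma=0.01):
    # noisy gradient of the outer function f1
    return u + sigma * rng.standard_normal(m)

x = rng.standard_normal(d)
u = np.zeros(m)           # moving-average tracker for f2(x)
tau, alpha = 0.5, 0.05    # averaging weight and step size (illustrative)

for k in range(500):
    val2, jac2 = oracle_f2(x)
    u = (1 - tau) * u + tau * val2    # moving-average estimate of f2(x)
    g = jac2.T @ oracle_grad_f1(u)    # chain-rule stochastic gradient
    x = x - alpha * g                 # plain stochastic step (unconstrained)

print("final gradient norm:", np.linalg.norm(A.T @ (A @ x)))

A T-level version would maintain one such tracker per level; per the abstract, replacing the plain moving average with linearized stochastic estimates of the function values is what improves the complexity to O_T(1/ε^4) and removes the minibatch requirement.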

Details

Language :
English
ISSN :
1052-6234
Volume :
32
Issue :
2
Database :
Academic Search Index
Journal :
SIAM Journal on Optimization
Publication Type :
Academic Journal
Accession number :
156648391
Full Text :
https://doi.org/10.1137/21M1406222