
Fast Decentralized Nonconvex Finite-Sum Optimization with Recursive Variance Reduction

Authors :
Xin, Ran
Khan, Usman A.
Kar, Soummya
Source :
SIAM Journal on Optimization. 2022, Vol. 32 Issue 1, p1-28. 28p.
Publication Year :
2022

Abstract

This paper considers decentralized minimization of N := nm smooth nonconvex cost functions equally divided over a directed network of n nodes. Specifically, we describe a stochastic first-order gradient method, called GT-SARAH, that employs a SARAH-type variance reduction technique and gradient tracking (GT) to address the stochastic and decentralized nature of the problem. We show that GT-SARAH, with appropriate algorithmic parameters, finds an ϵ-accurate first-order stationary point with O(max{N^{1/2}, n(1-λ)^{-2}, n^{2/3}m^{1/3}(1-λ)^{-1}} Lϵ^{-2}) gradient complexity, where (1-λ) ∈ (0, 1] is the spectral gap of the network weight matrix and L is the smoothness parameter of the cost functions. This gradient complexity outperforms that of the existing decentralized stochastic gradient methods. In particular, in a big-data regime such that n = O(N^{1/2}(1-λ)^{3}), this gradient complexity further reduces to O(N^{1/2}Lϵ^{-2}), independent of the network topology, and matches that of the centralized near-optimal variance-reduced methods. Moreover, in this regime GT-SARAH achieves a nonasymptotic linear speedup, in that the total number of gradient computations at each node is reduced by a factor of 1/n compared to the centralized near-optimal algorithms that perform all gradient computations at a single node. To the best of our knowledge, GT-SARAH is the first algorithm that achieves this property. In addition, we show that appropriate choices of the local minibatch size balance the trade-offs between the gradient and communication complexities of GT-SARAH. Over an infinite time horizon, we establish that all nodes in GT-SARAH asymptotically achieve consensus and converge to a first-order stationary point in the almost sure and mean-squared sense. [ABSTRACT FROM AUTHOR]
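Since the abstract describes GT-SARAH only at a high level, the minimal NumPy sketch below illustrates the two ingredients it names: a SARAH-type recursive gradient estimator with periodic full-gradient refreshes at each node, combined with a gradient-tracking recursion over a doubly stochastic mixing matrix W. The ring topology, quadratic least-squares components (a convex stand-in used only to keep the script self-contained), step size, refresh period, and exact update ordering are illustrative assumptions, not the paper's pseudocode or tuned parameters; see the full text at the DOI below for the precise algorithm.

    import numpy as np

    rng = np.random.default_rng(0)

    n, m, d = 5, 20, 3            # nodes, samples per node, problem dimension
    alpha, q, T = 0.05, 20, 600   # step size, refresh period, total iterations

    A = rng.normal(size=(n, m, d))   # node i holds samples (A[i, k], b[i, k])
    b = rng.normal(size=(n, m))

    def grad(i, k, x):
        # gradient of the k-th local least-squares component at node i
        a = A[i, k]
        return (a @ x - b[i, k]) * a

    def full_grad(i, x):
        return np.mean([grad(i, k, x) for k in range(m)], axis=0)

    # doubly stochastic mixing matrix for an undirected ring of n nodes
    I = np.eye(n)
    W = I / 2 + (np.roll(I, 1, axis=1) + np.roll(I, -1, axis=1)) / 4

    x = np.tile(rng.normal(size=d), (n, 1))               # local iterates
    v = np.array([full_grad(i, x[i]) for i in range(n)])  # SARAH estimators
    y = v.copy()                                          # gradient trackers

    for t in range(T):
        x_new = W @ x - alpha * y        # mix with neighbors, step along tracker
        v_new = np.empty_like(v)
        for i in range(n):
            if (t + 1) % q == 0:         # periodic full local gradient refresh
                v_new[i] = full_grad(i, x_new[i])
            else:                        # SARAH recursion on one random sample
                k = rng.integers(m)
                v_new[i] = grad(i, k, x_new[i]) - grad(i, k, x[i]) + v[i]
        y = W @ y + v_new - v            # gradient tracking of the estimators
        x, v = x_new, v_new

    x_bar = x.mean(axis=0)
    g = np.mean([full_grad(i, x_bar) for i in range(n)], axis=0)
    print("consensus error:", np.linalg.norm(x - x_bar))
    print("gradient norm at average:", np.linalg.norm(g))

The tracker update y = W y + v_new - v is what lets each node's descent direction follow the network-average gradient estimator, which is the gradient-tracking ("GT") half of GT-SARAH; the v recursion is the SARAH half.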

Details

Language :
English
ISSN :
1052-6234
Volume :
32
Issue :
1
Database :
Academic Search Index
Journal :
SIAM Journal on Optimization
Publication Type :
Academic Journal
Accession Number :
155794702
Full Text :
https://doi.org/10.1137/20M1361158