
A decentralized Nesterov gradient method for stochastic optimization over unbalanced directed networks.

Authors :
Hu, Jinhui
Xia, Dawen
Cheng, Huqiang
Feng, Liping
Ji, Lianghao
Guo, Jing
Li, Huaqing
Source :
Asian Journal of Control; Mar 2022, Vol. 24, Issue 2, p576-593, 18p
Publication Year :
2022

Abstract

Decentralized stochastic gradient methods play a significant role in large-scale optimization, with many practical applications in machine learning and coordinated control. This paper studies optimization problems over unbalanced directed networks, where the mutual goal of the agents is to optimize a global objective function expressed as a sum of local objective functions. Each agent, using only local computation and communication, is assumed to have access to a stochastic first-order oracle. To devise a noise-tolerant decentralized algorithm with accelerated linear convergence, this paper proposes a decentralized Nesterov gradient algorithm with a constant step-size and momentum parameter that uses stochastic gradients. The proposed algorithm employs a gradient-tracking technique and is proved, via the analysis of a linear system, to converge linearly to an error ball around the optimal solution when the positive constant step-size and momentum parameter are sufficiently small. Exact linear convergence is further recovered for the proposed algorithm with exact gradients under the same conditions on the constant step-size and momentum parameter. Simulations on real-world data sets validate the theoretical findings and the practicality of the proposed algorithm. [ABSTRACT FROM AUTHOR]
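The abstract combines three ingredients: gradient tracking, Nesterov-style momentum with a constant step-size and parameter, and weight matrices compatible with unbalanced directed graphs. Below is a minimal Python sketch in that spirit, not the authors' exact recursion: it assumes a push-pull-style scheme with a row-stochastic matrix R and a column-stochastic matrix C on a directed ring, illustrative least-squares objectives, a Gaussian-noise oracle, and hand-picked constants alpha and beta.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem setup: n agents, each holding a local least-squares objective
# f_i(x) = 0.5 * ||A_i x - b_i||^2; the global objective is their sum.
n, d = 5, 3
A_loc = rng.normal(size=(n, 10, d))
b_loc = rng.normal(size=(n, 10))

def stoch_grad(i, x, noise=0.01):
    # Stochastic first-order oracle: exact local gradient plus Gaussian noise.
    g = A_loc[i].T @ (A_loc[i] @ x - b_loc[i])
    return g + noise * rng.normal(size=d)

# Directed ring network. R is row-stochastic (mixes estimates pulled from
# in-neighbors); C is column-stochastic (pushes gradient trackers to
# out-neighbors). Neither is doubly stochastic, which is what allows the
# scheme to run on unbalanced directed graphs.
R = np.zeros((n, n))
C = np.zeros((n, n))
for i in range(n):
    R[i, i] = R[i, (i - 1) % n] = 0.5
    C[i, i] = C[(i + 1) % n, i] = 0.5

alpha, beta = 0.01, 0.3          # constant step-size and momentum parameter
x = np.zeros((n, d))             # row i = agent i's estimate
y = x.copy()                     # momentum-extrapolated estimates
g_old = np.stack([stoch_grad(i, y[i]) for i in range(n)])
s = g_old.copy()                 # gradient trackers, initialized at local grads

for k in range(2000):
    x_new = R @ y - alpha * s                 # consensus + tracked-gradient step
    y_new = x_new + beta * (x_new - x)        # Nesterov-style extrapolation
    g_new = np.stack([stoch_grad(i, y_new[i]) for i in range(n)])
    s = C @ s + g_new - g_old                 # gradient-tracking update
    x, y, g_old = x_new, y_new, g_new

# Sanity check against the centralized least-squares solution: with noisy
# gradients the iterates settle in an error ball around it.
x_star = np.linalg.lstsq(A_loc.reshape(-1, d), b_loc.reshape(-1), rcond=None)[0]
print("mean distance to optimum:", np.linalg.norm(x - x_star, axis=1).mean())
```

Setting noise=0 in stoch_grad gives the exact-gradient regime described in the abstract, where the distance to the optimum should decay linearly instead of plateauing at a noise floor.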

Details

Language :
English
ISSN :
1561-8625
Volume :
24
Issue :
2
Database :
Complementary Index
Journal :
Asian Journal of Control
Publication Type :
Academic Journal
Accession number :
155977958
Full Text :
https://doi.org/10.1002/asjc.2483