Distributed Stochastic Consensus Optimization With Momentum for Nonconvex Nonsmooth Problems.

Authors :
Wang, Zhiguo
Zhang, Jiawei
Chang, Tsung-Hui
Li, Jian
Luo, Zhi-Quan
Source :
IEEE Transactions on Signal Processing; 11/15/2021, p4486-4501, 16p
Publication Year :
2021

Abstract

While many distributed optimization algorithms have been proposed for solving smooth or convex problems over networks, few of them can handle non-convex and non-smooth problems. Based on a proximal primal-dual approach, this paper presents a new (stochastic) distributed algorithm with Nesterov momentum for accelerated optimization of non-convex and non-smooth problems. Theoretically, we show that the proposed algorithm can achieve an $\epsilon$-stationary solution under a constant step size with $\mathcal{O}(1/\epsilon^2)$ computation complexity and $\mathcal{O}(1/\epsilon)$ communication complexity when the epigraph of the non-smooth term is a polyhedral set. Compared to existing gradient-tracking-based methods, the proposed algorithm has the same order of computation complexity but a lower order of communication complexity. To the best of our knowledge, this is the first stochastic algorithm with $\mathcal{O}(1/\epsilon)$ communication complexity for non-convex and non-smooth problems. Numerical experiments on a distributed non-convex regression problem and a deep neural network-based classification problem illustrate the effectiveness of the proposed algorithms. [ABSTRACT FROM AUTHOR]
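For intuition only, the sketch below illustrates the general setting the abstract describes: agents on a network minimize a sum of local smooth losses plus a shared non-smooth regularizer, using a consensus (mixing) step, a proximal step for the non-smooth term, and Nesterov-style momentum. It is a minimal assumption-laden illustration of decentralized proximal gradient with momentum, not the paper's proximal primal-dual algorithm; the local quadratic losses, ring-network mixing matrix, and step-size/momentum values are all hypothetical.

```python
# Illustrative sketch (NOT the authors' exact method): decentralized proximal
# gradient descent with Nesterov-style momentum over a ring network.
# Local losses, mixing weights, and hyperparameters are assumptions for demo.
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim = 4, 10

# Each agent i holds a private smooth loss f_i(x) = 0.5 * ||A_i x - b_i||^2.
A = [rng.standard_normal((20, dim)) for _ in range(n_agents)]
b = [rng.standard_normal(20) for _ in range(n_agents)]

# Doubly stochastic mixing matrix for a ring network.
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i + 1) % n_agents] = 0.25
    W[i, (i - 1) % n_agents] = 0.25

lam, step, beta = 0.1, 1e-3, 0.9   # L1 weight, step size, momentum (assumed)

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1, handling the non-smooth term."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros((n_agents, dim))      # local copies of the decision variable
x_prev = np.zeros_like(x)

for k in range(500):
    # Nesterov momentum: extrapolate from the last two iterates.
    y = x + beta * (x - x_prev)
    # Local gradients of the smooth losses (a stochastic variant would subsample).
    grads = np.stack([A[i].T @ (A[i] @ y[i] - b[i]) for i in range(n_agents)])
    # Consensus (mixing) step, then a local proximal gradient step.
    mixed = W @ y
    x_prev = x
    x = soft_threshold(mixed - step * grads, step * lam)

print("consensus disagreement:", np.linalg.norm(x - x.mean(axis=0)))
```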

Details

Language :
English
ISSN :
1053587X
Database :
Complementary Index
Journal :
IEEE Transactions on Signal Processing
Publication Type :
Academic Journal
Accession number :
153880594
Full Text :
https://doi.org/10.1109/TSP.2021.3097211