Quantized Distributed Nonconvex Optimization with Linear Convergence
- Publication Year :
- 2022
Abstract
- This paper considers distributed nonconvex optimization for minimizing the average of local cost functions, using local information exchange over undirected communication networks. Since communication channels often have limited bandwidth or capacity, we first introduce a quantization rule and an encoder/decoder scheme to reduce the number of transmitted bits. By integrating them with a distributed algorithm, we then propose a distributed quantized nonconvex optimization algorithm. Assuming the global cost function satisfies the Polyak-Łojasiewicz condition, which does not require the global cost function to be convex and under which the global minimizer is not necessarily unique, we show that the proposed algorithm converges linearly to a global optimal point. Moreover, a low data rate is shown to be sufficient to ensure linear convergence when the algorithm parameters are properly chosen. The theoretical results are illustrated by numerical simulation examples.
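The quantized communication described in the abstract can be illustrated with a standard "zoom-in" encoder/decoder pair: each transmission carries only a finite-level quantization of the innovation (the difference between the current value and a state estimate mirrored at both ends), and the quantizer range shrinks geometrically as the iterates converge. The sketch below is a generic instance of that idea, not the paper's exact quantization rule; the parameters `levels`, `scale`, and `decay` are illustrative assumptions.

```python
import numpy as np

def quantize(x, levels, scale):
    """Uniform quantizer: maps x to the nearest of 2*levels + 1 points in [-scale, scale]."""
    clipped = np.clip(x / scale, -1.0, 1.0)
    return np.round(clipped * levels) / levels * scale

class Coder:
    """State shared by encoder and decoder: both maintain identical estimates."""
    def __init__(self, levels=8, scale=2.0, decay=0.9):
        self.levels, self.scale, self.decay = levels, scale, decay
        self.estimate = 0.0  # synchronized estimate of the transmitted signal

class Encoder(Coder):
    def encode(self, x):
        # Quantize only the innovation relative to the shared estimate.
        q = quantize(x - self.estimate, self.levels, self.scale)
        self.estimate += q
        self.scale *= self.decay  # zoom in as the signal settles
        return q  # one of 2*levels + 1 symbols, so finitely many bits per round

class Decoder(Coder):
    def decode(self, q):
        self.estimate += q
        self.scale *= self.decay  # mirror the encoder's zoom schedule
        return self.estimate

# A fixed target is recovered to high accuracy from finite-bit messages.
enc, dec = Encoder(), Decoder()
x = 1.3
for _ in range(60):
    estimate = dec.decode(enc.encode(x))
```

Each message is one of 2·levels + 1 symbols, so ⌈log₂(2·levels + 1)⌉ bits per transmission suffice, which is the sense in which a low data rate can still support linear convergence when the range decay is matched to the algorithm's convergence rate.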
Details
- Database :
- OAIster
- Authors :
- Xu, Lei, Yi, Xinlei, Sun, Jiayue, Shi, Yang, Johansson, Karl H., Yang, Tao
- Publication Type :
- Electronic Resource
- Accession number :
- edsoai.on1400069151
- Document Type :
- Electronic Resource
- Full Text :
- https://doi.org/10.1109/CDC51059.2022.9992989