Finite-Bit Quantization for Distributed Algorithms With Linear Convergence.

Authors :
Michelusi, Nicolò
Scutari, Gesualdo
Lee, Chang-Shen
Source :
IEEE Transactions on Information Theory; Nov 2022, Vol. 68, Issue 11, p7254-7280, 27p
Publication Year :
2022

Abstract

This paper studies distributed algorithms for (strongly convex) composite optimization problems over mesh networks, subject to quantized communications. Instead of focusing on a specific algorithmic design, a black-box model is proposed, casting linearly convergent distributed algorithms in the form of fixed-point iterates. The algorithmic model is equipped with a novel random or deterministic Biased Compression (BC) rule on the quantizer design, and a new Adaptive encoding Non-uniform Quantizer (ANQ) coupled with a communication-efficient encoding scheme, which implements the BC-rule using a finite number of bits (below machine precision). This fills a gap in most state-of-the-art quantization schemes, such as those based on the popular compression rule, which rely on communicating some scalar signals with negligible quantization error (in practice, quantized at machine precision). A unified communication complexity analysis is developed for the black-box model, determining the average number of bits required to reach a solution of the optimization problem within a target accuracy. It is shown that the proposed BC-rule preserves the linear convergence of the unquantized algorithms, and a trade-off between convergence rate and communication cost under ANQ-based quantization is characterized. Numerical results validate the theoretical findings and show that distributed algorithms equipped with the proposed ANQ incur lower communication cost than algorithms using state-of-the-art quantization rules.
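The mechanism the abstract describes, a linearly convergent iterate whose exchanged signals pass through a finite-bit quantizer with a geometrically shrinking range, can be illustrated in a few lines. The Python sketch below is not the paper's ANQ or BC-rule: the uniform quantizer, the quadratic objective, and every constant (alpha, shrink, bits) are illustrative assumptions, chosen only to show how differential encoding plus a shrinking quantization range lets gradient descent converge linearly to the exact optimum while transmitting a fixed, finite number of bits per round.

```python
# A minimal sketch (NOT the paper's ANQ): gradient descent on a strongly
# convex quadratic where the worker transmits its gradient through a
# finite-bit uniform quantizer, using differential encoding and a
# geometrically shrinking range. All constants are illustrative choices.
import numpy as np

def quantize(v, scale, bits):
    # Round each coordinate to a uniform grid on [-scale, scale]; with
    # `bits` bits per coordinate, the error is at most scale / (2**bits - 1).
    step = 2.0 * scale / (2 ** bits - 1)
    return np.clip(np.round(v / step) * step, -scale, scale)

A = np.diag([1.0, 2.0])          # Hessian with mu = 1, L = 2
b = np.array([1.0, -1.0])
x_star = np.linalg.solve(A, b)   # exact minimizer of 0.5 x'Ax - b'x

x = np.zeros(2)
g_hat = np.zeros(2)              # gradient estimate shared by worker and server
alpha = 2.0 / 3.0                # step size 2 / (mu + L)
scale, shrink, bits = 4.0, 0.7, 6

for t in range(50):
    g = A @ x - b                         # worker computes the true gradient...
    q = quantize(g - g_hat, scale, bits)  # ...but sends only a quantized innovation
    g_hat = g_hat + q                     # both sides update the shared estimate
    x = x - alpha * g_hat                 # server steps with the quantized gradient
    scale *= shrink                       # range tracks the (assumed) linear decay

print("distance to optimum:", np.linalg.norm(x - x_star))
```

In this toy setup, the quantization error contracts at rate 0.7 while the unquantized iterate contracts at rate 1/3, so the slower of the two rates governs overall convergence; this is the flavor of the rate-versus-communication-cost trade-off the paper characterizes for its black-box model.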

Details

Language :
English
ISSN :
0018-9448
Volume :
68
Issue :
11
Database :
Complementary Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession Number :
160651130
Full Text :
https://doi.org/10.1109/TIT.2022.3176253