
Q-SHED: Distributed Optimization at the Edge via Hessian Eigenvectors Quantization

Authors :
Dal Fabbro, Nicolò
Rossi, Michele
Schenato, Luca
Dey, Subhrakanti
Publication Year :
2023

Abstract

Edge networks call for communication-efficient (low-overhead) and robust distributed optimization (DO) algorithms. These are, in fact, desirable qualities for DO frameworks, such as federated edge learning techniques, in the presence of data and system heterogeneity, and in scenarios where inter-node communication is the main bottleneck. Although computationally demanding, Newton-type (NT) methods have recently been advocated as enablers of robust convergence rates in challenging DO problems where edge devices have sufficient computational power. Along these lines, in this work we propose Q-SHED, an original NT algorithm for DO featuring a novel bit-allocation scheme based on incremental Hessian eigenvector quantization. The proposed technique is integrated with the recent SHED algorithm, from which it inherits appealing features such as a small number of required Hessian computations, while being bandwidth-versatile at a bit-resolution level. Our empirical evaluation against competing approaches shows that Q-SHED can reduce the number of communication rounds required for convergence by up to 60%.
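The abstract describes the approach only at a high level. As a rough, illustrative sketch of what quantizing Hessian eigenvectors under a bit budget could look like, the NumPy snippet below keeps the top-k eigenpairs of a worker's local Hessian, assigns each eigenvector a bit resolution proportional to its eigenvalue, and uniformly quantizes it. The function names, the eigenvalue-proportional allocation rule, and the uniform quantizer are assumptions made for illustration; they do not reproduce the actual Q-SHED bit-allocation scheme or its incremental mechanism.

import numpy as np

def quantize_vector(v, bits):
    # Uniform quantizer: snap each entry of v to a grid with 2**bits levels.
    # A generic choice for illustration; Q-SHED's quantizer may differ.
    if bits <= 0:
        return np.zeros_like(v)
    levels = 2 ** bits
    vmax = np.max(np.abs(v)) + 1e-12
    step = 2.0 * vmax / (levels - 1)
    return np.round(v / step) * step

def compress_hessian(H, k, total_bits):
    # Keep the top-k eigenpairs of a symmetric local Hessian and split the bit
    # budget across eigenvectors in proportion to eigenvalue magnitude
    # (a hypothetical stand-in for the paper's bit-allocation rule).
    eigvals, eigvecs = np.linalg.eigh(H)           # eigenvalues in ascending order
    idx = np.argsort(eigvals)[::-1][:k]            # indices of the k largest eigenvalues
    lam, V = eigvals[idx], eigvecs[:, idx]
    weights = lam / lam.sum()
    bits = np.maximum(1, np.round(weights * total_bits)).astype(int)
    V_q = np.column_stack([quantize_vector(V[:, i], bits[i]) for i in range(k)])
    # Reconstruct an approximate Hessian from quantized eigenvectors and exact eigenvalues.
    H_hat = V_q @ np.diag(lam) @ V_q.T
    return H_hat, bits

# Toy usage: a worker compresses its local Hessian before communicating it.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 8))
H_local = A.T @ A / 20 + 0.1 * np.eye(8)           # PSD local Hessian of a ridge-type loss
H_approx, bit_alloc = compress_hessian(H_local, k=3, total_bits=24)
print("bit allocation per eigenvector:", bit_alloc)
print("approximation error:", np.linalg.norm(H_local - H_approx))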

Details

Language :
English
Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....6f5ed44e3537d55e69b4c1dceee5df92