vqSGD: Vector Quantized Stochastic Gradient Descent.
- Authors
Gandikota, Venkata; Kane, Daniel; Maity, Raj Kumar; and Mazumdar, Arya
- Subjects
VECTOR quantization; BINARY codes; ERROR-correcting codes; POINT set theory; COST control; STOCHASTIC processes
- Abstract
In this work, we present a family of vector quantization schemes vqSGD (Vector-Quantized Stochastic Gradient Descent) that provide an asymptotic reduction in the communication cost with convergence guarantees in first-order distributed optimization. In the process, we derive the following fundamental information-theoretic fact: $\Theta \left({\frac {d}{R^{2}}}\right)$ bits are necessary and sufficient (up to an additive $O(\log d)$ term) to describe an unbiased estimator $\hat{\boldsymbol {g}}(\boldsymbol {g})$ for any $\boldsymbol {g}$ in the $d$-dimensional unit sphere, under the constraint that $\| \hat{\boldsymbol {g}}(\boldsymbol {g})\|_{2}\le R$ almost surely, $R > 1$. In particular, we consider a randomized scheme based on the convex hull of a point set that returns an unbiased estimator of a $d$-dimensional gradient vector with almost surely bounded norm. We provide multiple efficient instances of our scheme that are near-optimal and require $o(d)$ bits of communication at the expense of a tolerable increase in error. The instances of our quantization scheme are obtained using well-known families of binary error-correcting codes and provide a smooth tradeoff between the communication cost and the estimation error of quantization. Furthermore, we show that vqSGD also offers automatic privacy guarantees. [ABSTRACT FROM AUTHOR]
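As a concrete illustration of the convex-hull idea described in the abstract, the sketch below uses the scaled cross-polytope $\{\pm\sqrt{d}\,\boldsymbol{e}_i\}$ as the point set: any $\boldsymbol{g}$ with $\|\boldsymbol{g}\|_2 \le 1$ has $\|\boldsymbol{g}\|_1 \le \sqrt{d}$ and therefore lies in the convex hull, so sampling a vertex with the convex-combination weights as probabilities yields an unbiased estimator that can be communicated in $\lceil\log_2 2d\rceil$ bits. This is a minimal, hypothetical sketch of one such instance, not the paper's exact construction; the function name `quantize` and the specific probability split are illustrative assumptions.

```python
import numpy as np

def quantize(g, rng):
    """Sample one vertex of the scaled cross-polytope {+-sqrt(d) e_i}
    so that the sampled vertex is an unbiased estimator of g
    (assumes ||g||_2 <= 1). Returns (vertex index, vertex vector)."""
    d = g.size
    gamma = np.sqrt(d)                  # scaling so the hull covers the unit l2 ball
    l1 = np.abs(g).sum()                # ||g||_1 <= sqrt(d) when ||g||_2 <= 1
    # Convex weights: |g_i|/gamma on the matching-sign vertex, with the
    # leftover mass spread evenly over +- pairs (pairs cancel in expectation).
    slack = (1.0 - l1 / gamma) / (2 * d)
    # Probabilities over 2d vertices: first d are +gamma*e_i, last d are -gamma*e_i.
    p = np.concatenate([np.maximum(g, 0), np.maximum(-g, 0)]) / gamma + slack
    idx = rng.choice(2 * d, p=p)        # only this index (log2(2d) bits) is sent
    v = np.zeros(d)
    v[idx % d] = gamma if idx < d else -gamma
    return idx, v
```

One can verify unbiasedness directly: with vertex matrix $V$ stacking $\pm\sqrt{d}\,\boldsymbol{e}_i$, the exact expectation $p^\top V$ recovers $\boldsymbol{g}$.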
- Published
- 2022