
Low-Precision Stochastic Gradient Langevin Dynamics

Authors:
Zhang, Ruqi
Wilson, Andrew Gordon
De Sa, Christopher
Publication Year:
2022

Abstract

While low-precision optimization has been widely used to accelerate deep learning, low-precision sampling remains largely unexplored. As a consequence, sampling is simply infeasible in many large-scale scenarios, despite providing remarkable benefits to generalization and uncertainty estimation for neural networks. In this paper, we provide the first study of low-precision Stochastic Gradient Langevin Dynamics (SGLD), showing that its costs can be significantly reduced without sacrificing performance, due to its intrinsic ability to handle system noise. We prove that the convergence of low-precision SGLD with full-precision gradient accumulators is less affected by the quantization error than its SGD counterpart in the strongly convex setting. To further enable low-precision gradient accumulators, we develop a new quantization function for SGLD that preserves the variance in each update step. We demonstrate that low-precision SGLD achieves comparable performance to full-precision SGLD with only 8 bits on a variety of deep learning tasks.

Published at ICML 2022
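As a rough illustration of the setting the abstract describes (low-precision SGLD with full-precision gradient accumulators), the sketch below shows a generic SGLD step that updates a full-precision weight copy and emits a stochastically rounded low-precision copy for use in forward/backward passes. The function names, hyperparameters, and the use of plain stochastic rounding are assumptions for illustration; the paper's variance-preserving quantization function is not reproduced here.

```python
import numpy as np


def stochastic_round(x, step):
    """Quantize x to a fixed-point grid with spacing `step` using stochastic
    rounding, which is unbiased in expectation (an assumed, generic quantizer,
    not the paper's variance-preserving one)."""
    scaled = x / step
    floor = np.floor(scaled)
    prob_up = scaled - floor
    return (floor + (np.random.rand(*x.shape) < prob_up)) * step


def sgld_step_low_precision(theta_fp, grad, lr, temperature, quant_step):
    """One SGLD step with a full-precision accumulator `theta_fp`.

    The Langevin update (gradient step plus Gaussian noise) is applied in full
    precision; the returned low-precision copy is what a low-precision model
    would use for computation.
    """
    noise = np.sqrt(2.0 * lr * temperature) * np.random.randn(*theta_fp.shape)
    theta_fp = theta_fp - lr * grad + noise            # full-precision update
    theta_lp = stochastic_round(theta_fp, quant_step)  # low-precision weights
    return theta_fp, theta_lp


# Toy usage on a quadratic potential 0.5 * ||theta||^2 (hypothetical settings).
theta_fp = np.zeros(10)
for _ in range(100):
    grad = theta_fp  # gradient of the quadratic potential
    theta_fp, theta_lp = sgld_step_low_precision(
        theta_fp, grad, lr=1e-2, temperature=1.0, quant_step=2.0 ** -8)
```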

Details

Language:
English
Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....f98232a1c8d170ddf1a1499adaebf220