
QReg: On Regularization Effects of Quantization

Authors:
AskariHemmat, MohammadHossein
Hemmat, Reyhane Askari
Hoffman, Alex
Lazarevich, Ivan
Saboori, Ehsan
Mastropietro, Olivier
Sah, Sudhakar
Savaria, Yvon
David, Jean-Pierre
Publication Year:
2022

Abstract

In this paper, we study the effects of quantization in DNN training. We hypothesize that weight quantization is a form of regularization and that the amount of regularization is correlated with the quantization level (precision). We confirm this hypothesis with an analytical study and empirical results. By modeling weight quantization as a form of additive noise on the weights, we explore how this noise propagates through the network at training time. We then show that the magnitude of this noise is correlated with the level of quantization. To confirm our analytical study, we performed an extensive set of experiments, summarized in this paper, showing that the regularization effects of quantization can be observed across various vision tasks, models, and datasets. Based on our study, we propose that 8-bit quantization provides a reliable form of regularization in different vision tasks and models.
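The core idea in the abstract, that weight quantization acts like additive noise whose magnitude grows as precision drops, can be illustrated with a minimal sketch. The snippet below is not the paper's code; it assumes symmetric per-tensor uniform quantization, a Gaussian toy weight tensor, and example bit widths (2, 4, 8), and simply measures the standard deviation of the quantization error eps = w_q - w at each precision.

```python
# Illustrative sketch (not the authors' implementation): model uniform weight
# quantization as additive noise w_q = w + eps and check how the noise
# magnitude scales with precision. The bit widths, the Gaussian weight draw,
# and the helper name quantize_uniform are assumptions for this example.
import numpy as np

def quantize_uniform(w: np.ndarray, bits: int) -> np.ndarray:
    """Symmetric per-tensor uniform quantization to the given bit width."""
    qmax = 2 ** (bits - 1) - 1               # e.g. 127 for 8-bit
    scale = np.max(np.abs(w)) / qmax         # per-tensor scale factor
    return np.round(w / scale).clip(-qmax, qmax) * scale

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.05, size=100_000)      # toy weight tensor

for bits in (2, 4, 8):
    noise = quantize_uniform(w, bits) - w    # eps = w_q - w
    print(f"{bits}-bit: std of quantization noise = {noise.std():.6f}")

# Lower precision -> coarser quantization grid -> larger noise magnitude,
# i.e. a stronger noise-injection (regularization-like) effect, in line with
# the correlation the abstract describes.
```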

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2206.12372
Document Type:
Working Paper