1. AdamB: Decoupled Bayes by Backprop With Gaussian Scale Mixture Prior
- Author
Keigo Nishida and Makoto Taiji
- Subjects
Bayesian neural networks, covariate shift, decoupled weight decay, deep neural networks, uncertainty, variational inference, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
Overfitting of neural networks to training data is one of the most significant problems in machine learning. Bayesian neural networks (BNNs) are known to be robust against overfitting owing to their ability to model parameter uncertainty. Bayes by Backprop (BBB), a simple variational inference approach that optimizes variational parameters by backpropagation, has been proposed for training BNNs. However, many studies have found variational inference difficult to scale to large models such as deep neural networks. This study therefore proposes Adam with decoupled Bayes by Backprop (AdamB), which stabilizes BNN training by applying Adam's moment estimation to the gradient of the neural network. The proposed approach stabilizes the noisy gradients of BBB and mitigates excessive parameter changes. In addition, AdamB combined with a Gaussian scale mixture prior can suppress the intrinsic increase in the variational parameters. AdamB exhibited more stable training than Adam applied to vanilla BBB. Furthermore, a covariate-shift benchmark on image classification tasks indicated that AdamB is more reliable than deep ensembles under noise-type covariate shifts. The considerations for stable BNN training with AdamB demonstrated on these image classification tasks are expected to provide useful insights for applications in other domains.
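The abstract describes AdamB only at a high level. The sketch below illustrates one possible reading of it: Adam's moment estimation is applied to the likelihood gradient of the reparameterized weights, while the KL/prior gradient enters as a separate, decoupled term (by analogy with AdamW's decoupled weight decay), together with the two-component Gaussian scale mixture prior familiar from the original Bayes by Backprop work. Function names, hyperparameters, and the exact form of the decoupling are assumptions for illustration, not the authors' reference implementation.

```python
import numpy as np

def log_gsm_prior(w, pi=0.5, sigma1=1.0, sigma2=np.exp(-6)):
    """Log density of a two-component Gaussian scale mixture prior,
    p(w) = pi * N(w; 0, sigma1^2) + (1 - pi) * N(w; 0, sigma2^2).
    (Mixture weights/scales are illustrative values, not the paper's.)"""
    def log_normal(x, s):
        return -0.5 * np.log(2 * np.pi * s**2) - x**2 / (2 * s**2)
    return np.logaddexp(np.log(pi) + log_normal(w, sigma1),
                        np.log(1 - pi) + log_normal(w, sigma2))

def adamb_step(mu, rho, grad_nll_w, grad_reg_mu, grad_reg_rho,
               eps_noise, state, lr=1e-3, betas=(0.9, 0.999),
               eps=1e-8, reg_scale=1e-3):
    """One hypothetical AdamB-style update of the variational parameters
    (mu, rho), where weights are sampled as w = mu + softplus(rho) * eps_noise.

    Adam's moment estimation is applied only to the likelihood gradient
    grad_nll_w; the KL/prior gradients (grad_reg_mu, grad_reg_rho) are
    added as a decoupled term, analogous to AdamW's decoupled weight decay."""
    state["t"] += 1
    # Reparameterization chain rule for the likelihood gradient.
    g_mu = grad_nll_w
    g_rho = grad_nll_w * eps_noise / (1.0 + np.exp(-rho))  # d softplus(rho) / d rho

    out = []
    for name, p, g, g_reg in (("mu", mu, g_mu, grad_reg_mu),
                              ("rho", rho, g_rho, grad_reg_rho)):
        m = state["m_" + name] = betas[0] * state["m_" + name] + (1 - betas[0]) * g
        v = state["v_" + name] = betas[1] * state["v_" + name] + (1 - betas[1]) * g * g
        m_hat = m / (1 - betas[0] ** state["t"])
        v_hat = v / (1 - betas[1] ** state["t"])
        # Adaptive step on the likelihood gradient + decoupled regularizer step.
        out.append(p - lr * (m_hat / (np.sqrt(v_hat) + eps) + reg_scale * g_reg))
    return out[0], out[1]

# Hypothetical usage: one moment-estimate state shared across update steps.
state = {"t": 0, "m_mu": 0.0, "v_mu": 0.0, "m_rho": 0.0, "v_rho": 0.0}
```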
- Published
2022