Uncertainty Calibration in Bayesian Neural Networks via Distance-Aware Priors
- Publication Year : 2022
Abstract
- As we move away from the data, the predictive uncertainty should increase, since a great variety of explanations are consistent with the little available information. We introduce Distance-Aware Prior (DAP) calibration, a method to correct overconfidence of Bayesian deep learning models outside of the training domain. We define DAPs as prior distributions over the model parameters that depend on the inputs through a measure of their distance from the training set. DAP calibration is agnostic to the posterior inference method, and it can be performed as a post-processing step. We demonstrate its effectiveness against several baselines in a variety of classification and regression problems, including benchmarks designed to test the quality of predictive distributions away from the data.
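The core idea sketched in the abstract, that the prior over model parameters should widen as inputs move away from the training set, can be illustrated with a minimal toy example. The functional form below (prior scale growing linearly with the minimum Euclidean distance to the training data) and all function names are illustrative assumptions, not the paper's actual construction:

```python
import numpy as np

def distance_to_train(x, X_train):
    """Minimum Euclidean distance from input x to the training set."""
    return np.min(np.linalg.norm(X_train - x, axis=1))

def dap_prior_scale(x, X_train, base_scale=1.0, rate=1.0):
    """Illustrative distance-aware prior scale: grows with distance
    from the data, so predictive uncertainty increases away from the
    training set. (Hypothetical form, not the paper's exact choice.)"""
    d = distance_to_train(x, X_train)
    return base_scale * (1.0 + rate * d)

# Toy training set and two query points.
X_train = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
near = np.array([0.1, 0.0])   # close to the data: prior stays tight
far = np.array([5.0, 5.0])    # far from the data: prior widens
```

Because the scale only depends on the inputs through their distance to the training set, a correction of this kind can be applied after posterior inference, consistent with the post-processing use described above.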
Details
- Database : arXiv
- Publication Type : Report
- Accession number : edsarx.2207.08200
- Document Type : Working Paper