Bayesian Constraint Relaxation
- Publication Year : 2018
Abstract
- Prior information often takes the form of parameter constraints. Bayesian methods include such information through prior distributions having constrained support. By using posterior sampling algorithms, one can quantify uncertainty without relying on asymptotic approximations. However, sharply constrained priors are (a) unnecessary in some settings; and (b) tend to limit modeling scope to a narrow set of distributions that are tractable computationally. Inspired by the vast literature that replaces the spike-and-slab prior with a continuous approximation, we propose to replace the sharp indicator function of the constraint with an exponential kernel, thereby creating a close-to-constrained neighborhood within the Euclidean space in which the constrained subspace is embedded. This kernel decays with distance from the constrained space at a rate depending on a relaxation hyperparameter. By avoiding the sharp constraint, we enable use of off-the-shelf posterior sampling algorithms, such as Hamiltonian Monte Carlo, facilitating automatic computation in broad models. We study the constrained and relaxed distributions under multiple settings, and theoretically quantify their differences. We illustrate the method through multiple novel modeling examples.
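- The record contains only the abstract, so the sketch below is an illustrative assumption rather than the paper's implementation: it uses a toy constraint (the unit sphere), a standard Gaussian base density, a kernel of the assumed form exp(-dist(theta, C)/lambda) in place of the indicator 1{theta in C}, and a plain hand-rolled Hamiltonian Monte Carlo sampler. The constraint choice, base density, kernel form, and tuning values are all hypothetical.

```python
import numpy as np

# Illustrative sketch (not the paper's code): relax the constraint set
# C = {theta : ||theta|| = 1} by replacing the indicator 1{theta in C}
# with exp(-dist(theta, C) / lam), where dist is Euclidean distance to C
# and lam > 0 is the relaxation hyperparameter. The relaxed density is
# smooth almost everywhere, so gradient-based samplers such as HMC apply.

def dist_to_sphere(theta):
    # Euclidean distance from theta to the unit sphere.
    return abs(np.linalg.norm(theta) - 1.0)

def grad_dist_to_sphere(theta):
    # (Sub)gradient of the distance; well defined away from the origin.
    r = np.linalg.norm(theta)
    return np.sign(r - 1.0) * theta / r

def relaxed_log_post(theta, lam):
    # Base density g(theta): standard Gaussian; relaxation term: -dist/lam.
    return -0.5 * theta @ theta - dist_to_sphere(theta) / lam

def grad_relaxed_log_post(theta, lam):
    return -theta - grad_dist_to_sphere(theta) / lam

def hmc_step(theta, eps, n_leap, lam, rng):
    # One vanilla HMC update: leapfrog integration plus Metropolis correction.
    p0 = rng.standard_normal(theta.shape)
    th, p = theta.copy(), p0.copy()
    p += 0.5 * eps * grad_relaxed_log_post(th, lam)
    for i in range(n_leap):
        th += eps * p
        if i < n_leap - 1:
            p += eps * grad_relaxed_log_post(th, lam)
    p += 0.5 * eps * grad_relaxed_log_post(th, lam)
    log_accept = (relaxed_log_post(th, lam) - 0.5 * p @ p
                  - relaxed_log_post(theta, lam) + 0.5 * p0 @ p0)
    return th if np.log(rng.uniform()) < log_accept else theta

rng = np.random.default_rng(0)
theta, lam = np.array([2.0, 0.0]), 0.05
draws = []
for _ in range(5000):
    theta = hmc_step(theta, eps=0.02, n_leap=25, lam=lam, rng=rng)
    draws.append(theta)
radii = np.linalg.norm(np.asarray(draws), axis=1)
print("mean distance from the constraint set:", np.abs(radii - 1.0).mean())
```

- In this toy setup, shrinking lambda concentrates the draws in a tighter neighborhood of the sphere, which is the sense in which a relaxed posterior approximates its sharply constrained counterpart; the paper's theoretical results quantify such differences for the actual method.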
- Subjects : Statistics - Methodology
Details
- Database : arXiv
- Publication Type : Report
- Accession number : edsarx.1801.01525
- Document Type : Working Paper