
Sampling for Inference in Probabilistic Models with Fast Bayesian Quadrature

Authors :
Gunter, Tom
Osborne, Michael A.
Garnett, Roman
Hennig, Philipp
Roberts, Stephen J.
Publication Year :
2014
Publisher :
arXiv, 2014.

Abstract

We propose a novel sampling framework for inference in probabilistic models: an active learning approach that converges more quickly (in wall-clock time) than Markov chain Monte Carlo (MCMC) benchmarks. The central challenge in probabilistic inference is numerical integration, to average over ensembles of models or unknown (hyper-)parameters (for example to compute the marginal likelihood or a partition function). MCMC has provided approaches to numerical integration that deliver state-of-the-art inference, but can suffer from sample inefficiency and poor convergence diagnostics. Bayesian quadrature techniques offer a model-based solution to such problems, but their uptake has been hindered by prohibitive computation costs. We introduce a warped model for probabilistic integrands (likelihoods) that are known to be non-negative, permitting a cheap active learning scheme to optimally select sample locations. Our algorithm is demonstrated to offer faster convergence (in seconds) relative to simple Monte Carlo and annealed importance sampling on both synthetic and real-world examples.
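To make the quadrature idea concrete, here is a minimal sketch of *vanilla* Bayesian quadrature in one dimension, not the paper's warped, actively-sampled method (WSABI): a Gaussian process with a squared-exponential kernel is placed on the integrand, and because the kernel integrates against a Gaussian prior in closed form, the posterior mean of the integral is a simple weighted sum of function evaluations. All kernel settings and the fixed sample design below are illustrative assumptions.

```python
import numpy as np

# Illustrative vanilla Bayesian quadrature (NOT the paper's warped WSABI
# method): estimate Z = ∫ f(x) N(x; 0, 1) dx by modelling f with a GP.

def rbf(x1, x2, ell=0.7, sf2=1.0):
    """Squared-exponential kernel matrix k(x1, x2)."""
    d = x1[:, None] - x2[None, :]
    return sf2 * np.exp(-0.5 * (d / ell) ** 2)

def kernel_mean(x, ell=0.7, sf2=1.0):
    """Closed form z_i = ∫ k(x, x_i) N(x; 0, 1) dx for the SE kernel."""
    return sf2 * ell / np.sqrt(ell**2 + 1.0) * np.exp(-0.5 * x**2 / (ell**2 + 1.0))

def bq_estimate(xs, fs, ell=0.7, sf2=1.0, jitter=1e-10):
    """Posterior mean of Z given evaluations fs = f(xs): z^T K^{-1} f."""
    K = rbf(xs, xs, ell, sf2) + jitter * np.eye(len(xs))
    z = kernel_mean(xs, ell, sf2)
    return z @ np.linalg.solve(K, fs)

# Toy non-negative integrand standing in for an unnormalized likelihood.
f = lambda x: np.exp(-0.5 * (x - 0.5) ** 2 / 0.3 ** 2)

xs = np.linspace(-3.0, 3.0, 15)   # fixed design; the paper instead selects
Z_bq = bq_estimate(xs, f(xs))     # sample locations by active learning

# Ground truth: the Gaussian-times-Gaussian integral has closed form.
s2 = 0.3 ** 2 + 1.0
Z_true = np.sqrt(0.3 ** 2 / s2) * np.exp(-0.5 * 0.5 ** 2 / s2)
print(Z_bq, Z_true)
```

The paper's contribution replaces the GP on `f` with a GP on (roughly) the square root of `f`, which enforces non-negativity of the likelihood model and yields a cheap active-learning rule for choosing the next evaluation point, rather than the fixed grid used above.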

Details

Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....550e32c50170811f03db87cab720c4d7
Full Text :
https://doi.org/10.48550/arxiv.1411.0439