Comparison of sampling techniques for Bayesian parameter estimation
- Publication Year :
- 2013
Abstract
- The posterior probability distribution for a set of model parameters encodes all that the data have to tell us in the context of a given model; it is the fundamental quantity for Bayesian parameter estimation. In order to infer the posterior probability distribution we have to decide how to explore parameter space. Here we compare three prescriptions for how parameter space is navigated, discussing their relative merits. We consider Metropolis-Hastings sampling, nested sampling and affine-invariant ensemble MCMC sampling. We focus on their performance on toy-model Gaussian likelihoods and on a real-world cosmological data set. We outline the sampling algorithms themselves and elaborate on performance diagnostics such as convergence time, scope for parallelisation, dimensional scaling, requisite tunings and suitability for non-Gaussian distributions. We find that nested sampling delivers high-fidelity estimates for posterior statistics at low computational cost, and should be adopted in favour of Metropolis-Hastings in many cases. Affine-invariant MCMC is competitive when computing clusters can be utilised for massive parallelisation. Affine-invariant MCMC and existing extensions to nested sampling naturally probe multi-modal and curving distributions.
- Comment: 13 pages, 6 figures
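As an illustration of the first of the three prescriptions, the sketch below runs random-walk Metropolis-Hastings on a toy two-dimensional Gaussian likelihood of the kind the paper uses as a benchmark. It is a minimal sketch, not the authors' code: the function names, the flat-prior assumption, the fixed Gaussian proposal covariance and the burn-in cut are choices made here for brevity.

```python
import numpy as np

def log_gaussian_likelihood(theta, mean, inv_cov):
    """Toy-model Gaussian log-likelihood (illustrative only)."""
    d = theta - mean
    return -0.5 * d @ inv_cov @ d

def metropolis_hastings(log_like, theta0, proposal_cov, n_steps, rng=None):
    """Random-walk Metropolis-Hastings with a fixed Gaussian proposal.

    Assumes flat priors, so the acceptance ratio reduces to the
    likelihood ratio. Returns the chain and the acceptance rate.
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = np.asarray(theta0, dtype=float)
    logp = log_like(theta)
    chain = np.empty((n_steps, theta.size))
    n_accept = 0
    for i in range(n_steps):
        # Symmetric Gaussian proposal, so no Hastings correction is needed.
        proposal = rng.multivariate_normal(theta, proposal_cov)
        logp_prop = log_like(proposal)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = proposal, logp_prop
            n_accept += 1
        chain[i] = theta
    return chain, n_accept / n_steps

# Example: 2-D toy Gaussian posterior with correlated parameters.
mean = np.zeros(2)
inv_cov = np.linalg.inv(np.array([[1.0, 0.5],
                                  [0.5, 2.0]]))
chain, acc_rate = metropolis_hastings(
    lambda t: log_gaussian_likelihood(t, mean, inv_cov),
    theta0=np.array([3.0, -3.0]),     # deliberately offset start
    proposal_cov=0.5 * np.eye(2),     # step size is a tuning choice
    n_steps=20000,
)
print(f"acceptance rate: {acc_rate:.2f}")
print("posterior mean estimate:", chain[5000:].mean(axis=0))  # discard burn-in
```

The proposal covariance here is the "requisite tuning" the abstract refers to: too small and the chain diffuses slowly, too large and proposals are rarely accepted, which is one reason the paper compares this scheme against nested sampling and affine-invariant ensemble methods that need less hand-tuning.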
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.1308.2675
- Document Type :
- Working Paper
- Full Text :
- https://doi.org/10.1093/mnras/stt2190