
Fast mixing hyperdynamic sampling

Authors :
Sminchisescu, Cristian
Triggs, Bill
Source :
Image & Vision Computing. Mar 2006, Vol. 24, Issue 3, p279-289. 11p.
Publication Year :
2006

Abstract

Sequential random sampling (‘Markov Chain Monte-Carlo’) is a popular strategy for many vision problems involving multi-modal distributions over high-dimensional parameter spaces. It applies both to importance sampling (where one wants to sample points according to their ‘importance’ for some calculation, but otherwise fairly) and to global optimization (where one wants to find good minima, or at least good starting points for local minimization, regardless of fairness). Unfortunately, most sequential samplers are very prone to becoming trapped for long periods in unrepresentative local minima, which leads to biased or highly variable estimates. We present a general strategy for reducing MCMC trapping that generalizes Voter's ‘hyperdynamic sampling’ from computational chemistry. The local gradient and curvature of the input distribution are used to construct an adaptive importance sampler that focuses samples on negative-curvature regions likely to contain low-cost ‘transition states’ (codimension-1 saddle points representing ‘mountain passes’ connecting adjacent cost basins). This substantially accelerates inter-basin transition rates while still preserving correct relative transition probabilities. Experimental tests on the difficult problem of 3D articulated human pose estimation from monocular images show significantly enhanced minimum exploration. [Copyright Elsevier]
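The bias-potential idea summarized in the abstract can be illustrated with a toy example. The following Python sketch is not the authors' implementation; the double-well cost, boost strength, temperature, and step size are illustrative assumptions. It runs random-walk Metropolis on a 1D cost whose basins are raised by a curvature-gated bias, so negative-curvature barrier regions are left nearly untouched and hops between basins become much more frequent than on the unbiased cost.

import numpy as np

rng = np.random.default_rng(0)

def cost(x):
    # Toy 1D double-well cost: minima at x = -1 and x = +1, barrier at x = 0.
    return (x**2 - 1.0)**2

def curvature(x, h=1e-4):
    # Finite-difference second derivative of the cost.
    return (cost(x + h) - 2.0 * cost(x) + cost(x - h)) / h**2

def bias(x, boost=0.6, scale=2.0):
    # Curvature-gated boost: close to `boost` inside basins (positive curvature),
    # close to zero near the negative-curvature barrier top, so the effective
    # barrier height shrinks while the barrier location is preserved.
    return boost / (1.0 + np.exp(-curvature(x) / scale))

def metropolis(n_steps=20000, step=0.25, beta=8.0, x0=-1.0, boost=0.6):
    # Random-walk Metropolis on the biased cost.
    x = x0
    e = cost(x) + bias(x, boost)
    samples = []
    for _ in range(n_steps):
        x_new = x + step * rng.standard_normal()
        e_new = cost(x_new) + bias(x_new, boost)
        if rng.random() < np.exp(-beta * (e_new - e)):
            x, e = x_new, e_new
        samples.append(x)
    return np.array(samples)

samples = metropolis()
# Sign changes count hops between the two basins; rerunning with boost=0.0
# (plain Metropolis on the unbiased cost) makes such hops far rarer.
print("basin switches:", int(np.sum(np.diff(np.sign(samples)) != 0)))

Setting boost=0.0 recovers an ordinary sampler that stays trapped in one basin for long stretches, which mirrors the trapping problem the paper addresses; the curvature gate is a crude stand-in for the gradient- and Hessian-based bias construction described in the abstract.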

Details

Language :
English
ISSN :
02628856
Volume :
24
Issue :
3
Database :
Academic Search Index
Journal :
Image & Vision Computing
Publication Type :
Academic Journal
Accession number :
20184276
Full Text :
https://doi.org/10.1016/j.imavis.2005.07.022