
Adversarially Training MCMC with Non-Volume-Preserving Flows.

Authors :
Liu, Shaofan
Sun, Shiliang
Source :
Entropy; Mar2022, Vol. 24 Issue 3, p415-415, 17p
Publication Year :
2022

Abstract

Recently, flow models parameterized by neural networks have been used to design efficient Markov chain Monte Carlo (MCMC) transition kernels. However, inefficient utilization of the gradient information of the target distribution, or the use of volume-preserving flows, limits their performance in sampling from multi-modal target distributions. In this paper, we treat the training procedure of the parameterized transition kernels in a different manner and exploit a novel scheme to train MCMC transition kernels. We divide the training process of transition kernels into an exploration stage and a training stage, which makes full use of the gradient information of the target distribution and the expressive power of deep neural networks. The transition kernels are constructed with non-volume-preserving flows and trained in an adversarial form. The proposed method achieves significant improvement in effective sample size and mixes quickly to the target distribution. Empirical results validate that the proposed method achieves low autocorrelation of samples and fast convergence rates, and outperforms other state-of-the-art parameterized transition kernels on a variety of challenging, analytically described distributions and real-world datasets. [ABSTRACT FROM AUTHOR]
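To illustrate the kind of construction the abstract describes — a non-volume-preserving flow used as an MCMC proposal with a Jacobian-corrected Metropolis-Hastings acceptance — here is a minimal, self-contained sketch. It is not the paper's method (there is no adversarial training or exploration stage here): the coupling layer's scale and shift "networks" `s` and `t` are hypothetical fixed stubs standing in for trained neural networks, and the flow is used as an independence proposal on a toy bimodal target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: unnormalized log-density of a 2D bimodal Gaussian mixture.
def log_target(x):
    m1, m2 = np.array([-2.0, 0.0]), np.array([2.0, 0.0])
    l1 = -0.5 * np.sum((x - m1) ** 2)
    l2 = -0.5 * np.sum((x - m2) ** 2)
    return np.logaddexp(l1, l2)

# Affine coupling layer (RealNVP-style). The scale/shift functions below are
# fixed illustrative stubs; in the paper they would be trained networks.
def s(u):  # log-scale "network" stub
    return 0.5 * np.tanh(u)

def t(u):  # shift "network" stub
    return 0.3 * u

def flow_forward(z):
    # Non-volume-preserving: the second coordinate is rescaled by exp(s(z1)),
    # so the Jacobian determinant differs from 1.
    x = np.array([z[0], z[1] * np.exp(s(z[0])) + t(z[0])])
    return x, s(z[0])  # also return log|det J|

def flow_inverse(x):
    z = np.array([x[0], (x[1] - t(x[0])) * np.exp(-s(x[0]))])
    return z, -s(x[0])

def log_q(x):
    # Density of the proposal: standard normal noise pushed through the flow,
    # with the log-determinant correction from the change of variables.
    z, logdet_inv = flow_inverse(x)
    return -0.5 * np.sum(z ** 2) + logdet_inv

def mh_step(x):
    # Independence Metropolis-Hastings step with the flow as proposal.
    z = rng.standard_normal(2)
    x_prop, _ = flow_forward(z)
    log_alpha = (log_target(x_prop) + log_q(x)) - (log_target(x) + log_q(x_prop))
    if np.log(rng.uniform()) < log_alpha:
        return x_prop, True
    return x, False

x = np.zeros(2)
accepted = 0
samples = []
for _ in range(5000):
    x, acc = mh_step(x)
    accepted += acc
    samples.append(x.copy())
samples = np.array(samples)
print("acceptance rate:", accepted / 5000)
```

Because the flow is non-volume-preserving, the `exp(s(·))` rescaling lets the proposal stretch probability mass toward separated modes — the property the abstract contrasts with volume-preserving flows — while the log-determinant term in `log_q` keeps the acceptance ratio exact.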

Details

Language :
English
ISSN :
10994300
Volume :
24
Issue :
3
Database :
Complementary Index
Journal :
Entropy
Publication Type :
Academic Journal
Accession number :
156002314
Full Text :
https://doi.org/10.3390/e24030415