
The Wang-Landau Algorithm as Stochastic Optimization and Its Acceleration

Authors:
Dai, Chenguang
Liu, Jun S.
Source:
Phys. Rev. E 101, 033301 (2020)
Publication Year:
2019

Abstract

We show that the Wang-Landau algorithm can be formulated as a stochastic gradient descent algorithm minimizing a smooth, convex objective function whose gradient is estimated using Markov chain Monte Carlo iterations. The optimization formulation provides a new way to establish the convergence rate of the Wang-Landau algorithm, by exploiting the fact that, almost surely, the density estimates (on the logarithmic scale) remain in a compact set, on which the objective function is strongly convex. The optimization viewpoint also motivates improving the efficiency of the Wang-Landau algorithm with popular stochastic-optimization tools, including the momentum method and adaptive learning rates. We demonstrate the accelerated Wang-Landau algorithm on a two-dimensional Ising model and a two-dimensional ten-state Potts model.

Comment: 10 pages, 3 figures
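To make the optimization reading of the abstract concrete: in the classic Wang-Landau algorithm, a Markov chain walks over configurations and the running log density-of-states estimate of the currently visited energy bin is incremented by a step size (the log modification factor) that is annealed over time; in the paper's view, that increment plays the role of a stochastic-gradient step and the modification factor plays the role of the learning rate. The sketch below is a minimal standard Wang-Landau run on a small two-dimensional Ising model, not the authors' code; the lattice size, flatness threshold, and annealing schedule are illustrative choices, and an accelerated variant in the paper's spirit would replace the plain increment marked in the comments with a momentum or adaptive-learning-rate update.

import numpy as np

rng = np.random.default_rng(0)

L = 8                           # lattice side (illustrative size)
N = L * L                       # total number of spins
spins = rng.choice([-1, 1], size=(L, L))

def total_energy(s):
    # nearest-neighbor Ising energy with periodic boundary conditions
    return int(-np.sum(s * (np.roll(s, 1, axis=0) + np.roll(s, 1, axis=1))))

def energy_bin(E):
    # Ising energies on a periodic square lattice lie in {-2N, -2N+4, ..., 2N}
    return (E + 2 * N) // 4

n_bins = energy_bin(2 * N) + 1
log_g = np.zeros(n_bins)        # running estimate of the log density of states
hist = np.zeros(n_bins)         # visit histogram for the flatness check
log_f = 1.0                     # log modification factor: the "learning rate"

E = total_energy(spins)
while log_f > 1e-3:             # anneal the step size, as in standard Wang-Landau
    for _ in range(1000 * N):
        i, j = rng.integers(L, size=2)
        # energy change from flipping spin (i, j)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * int(spins[i, j]) * int(nb)
        # accept with probability min(1, g(E_old) / g(E_new))
        if np.log(rng.random()) < log_g[energy_bin(E)] - log_g[energy_bin(E + dE)]:
            spins[i, j] *= -1
            E += dE
        k = energy_bin(E)
        log_g[k] += log_f       # the stochastic-gradient-like step on log g;
                                # an accelerated variant would modify this update
        hist[k] += 1
    # flatness check over visited bins only (two bins near +/-2N are unattainable)
    visited = hist > 0
    if hist[visited].min() > 0.8 * hist[visited].mean():
        hist[:] = 0
        log_f /= 2.0            # shrink the learning rate

# shift so the ground-state bin has log g = log 2 (all-up and all-down states)
print(log_g[visited] - log_g[0] + np.log(2))

Halving log_f mirrors the standard f -> sqrt(f) schedule on the log scale; it is this annealed step size that the paper analyzes as the learning-rate schedule of the equivalent stochastic gradient descent.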

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1907.11985
Document Type:
Working Paper
Full Text:
https://doi.org/10.1103/PhysRevE.101.033301