1. Variational neural annealing
- Author
- Mohamed Hibat-Allah, E. M. Inack, Roeland Wiersema, Roger G. Melko, and Juan Carrasquilla
- Subjects
- Spin glass, Optimization problem, Computer Networks and Communications, Computer science, Parameterized complexity, Complex network, Annealing (glass), Human-Computer Interaction, Recurrent neural network, Artificial Intelligence, Variational principle, Simulated annealing, Computer Vision and Pattern Recognition, Algorithm, Software
- Abstract
Many important challenges in science and technology can be cast as optimization problems. When viewed in a statistical physics framework, these can be tackled by simulated annealing, where a gradual cooling procedure helps search for ground-state solutions of a target Hamiltonian. Although powerful, simulated annealing is known to have prohibitively slow sampling dynamics when the optimization landscape is rough or glassy. Here we show that, by generalizing the target distribution with a parameterized model, an analogous annealing framework based on the variational principle can be used to search for ground-state solutions. Modern autoregressive models such as recurrent neural networks provide ideal parameterizations because they can be sampled exactly without slow dynamics, even when the model encodes a rough landscape. We implement this procedure in the classical and quantum settings on several prototypical spin glass Hamiltonians and find that, on average, it substantially outperforms traditional simulated annealing in the asymptotic limit, illustrating the potential power of this yet-unexplored route to optimization.

Optimization problems can be described in terms of a statistical physics framework. This offers the possibility to make use of ‘simulated annealing’, a procedure that searches for a target solution in a way analogous to the gradual cooling of a condensed-matter system to its ground state. The approach can now be sped up significantly by implementing a recurrent neural network model, in a new strategy called variational neural annealing.
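The procedure described in the abstract lends itself to a compact illustration: parameterize the distribution over spin configurations with an autoregressive model that can be sampled exactly, and minimize a variational free energy F_T = ⟨E⟩ − T·S while the temperature T is annealed toward zero. The sketch below is illustrative only and is not the authors' implementation; it assumes a small random fully connected Ising spin glass, a GRU-based autoregressive model, a REINFORCE-style gradient estimator, and a linear temperature schedule, and the names (AutoregressiveRNN, vca_step) are hypothetical.

```python
# Minimal sketch of variational (classical) neural annealing, assuming a random
# fully connected Ising spin glass and a GRU-based autoregressive model.
# All names here are illustrative, not taken from the paper's code.
import torch
import torch.nn as nn


class AutoregressiveRNN(nn.Module):
    """Autoregressive distribution p_theta(s) over N binary spins, sampled exactly."""

    def __init__(self, n_spins, hidden=32):
        super().__init__()
        self.n_spins = n_spins
        self.cell = nn.GRUCell(1, hidden)   # input: the previously sampled spin
        self.head = nn.Linear(hidden, 1)    # logit for P(s_i = 1 | s_<i)

    def sample_with_logprob(self, batch):
        h = torch.zeros(batch, self.cell.hidden_size)
        prev = torch.zeros(batch, 1)
        spins, logp = [], torch.zeros(batch)
        for _ in range(self.n_spins):
            h = self.cell(prev, h)
            p1 = torch.sigmoid(self.head(h)).squeeze(-1)   # P(s_i = 1 | s_<i)
            s = torch.bernoulli(p1)                        # exact ancestral sample
            logp = logp + s * torch.log(p1 + 1e-12) + (1 - s) * torch.log(1 - p1 + 1e-12)
            spins.append(s)
            prev = s.unsqueeze(-1)
        return torch.stack(spins, dim=1), logp             # shapes (batch, N), (batch,)


def ising_energy(spins, J):
    """E(s) = -sum_{i<j} J_ij s_i s_j with s_i in {-1, +1}; J is upper triangular."""
    s = 2.0 * spins - 1.0
    return -torch.einsum('bi,ij,bj->b', s, J, s)


def vca_step(model, opt, J, T, batch=256):
    """One step of minimizing F_T = <E> - T*S with a score-function (REINFORCE) gradient."""
    spins, logp = model.sample_with_logprob(batch)
    with torch.no_grad():
        per_sample_f = ising_energy(spins, J) + T * logp   # E(s) + T log p_theta(s)
        baseline = per_sample_f.mean()                     # variance-reduction baseline
    loss = torch.mean((per_sample_f - baseline) * logp)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return baseline.item()


if __name__ == "__main__":
    torch.manual_seed(0)
    n = 16
    J = torch.triu(torch.randn(n, n), diagonal=1) / n ** 0.5   # random couplings (toy instance)
    model = AutoregressiveRNN(n)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for T in torch.linspace(2.0, 0.0, 2000):                   # linear annealing schedule
        vca_step(model, opt, J, float(T))
    samples, _ = model.sample_with_logprob(1024)
    print("lowest sampled energy:", ising_energy(samples, J).min().item())
```

Because every configuration is drawn ancestrally from the model rather than from a Markov chain, this kind of estimator avoids the slow equilibration that hampers simulated annealing on rough landscapes, which is the point the abstract emphasizes.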
- Published
- 2021