Temperature Optimization for Parallel Tempering in Neural Networks
ORAL
Abstract
We benchmark a parallel tempering method for the variational optimization of neural networks used to approximate ground states of many-body quantum systems. We study two parameters in this method that play the role of temperature in standard parallel tempering. The first is the temperature associated with the entropy term added to the local energy samples, which acts to flatten the energy landscape. The second is the temperature in the swap rule that determines whether two neighboring replicas exchange their neural network configurations. We explore the effect of optimizing versus fixing these two temperatures on the performance of the variational algorithm for approximating the ground state of the two-dimensional J1-J2 model.
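The swap rule mentioned above is a variant of the standard replica-exchange (parallel tempering) acceptance step. As a point of reference only, here is a minimal sketch of that standard Metropolis swap rule; the function names, the scalar-energy representation of replica state, and the sweep structure are illustrative assumptions, not details of the method benchmarked in this work:

```python
import math
import random

def swap_accept_prob(beta_a, beta_b, energy_a, energy_b):
    """Metropolis acceptance probability for exchanging the configurations
    of two neighboring replicas at inverse temperatures beta_a and beta_b
    with current energies energy_a and energy_b."""
    delta = (beta_a - beta_b) * (energy_a - energy_b)
    return min(1.0, math.exp(delta))

def attempt_swaps(betas, energies, rng=random.random):
    """Sweep over neighboring replica pairs and exchange their states
    (represented here by their scalar energies) when the Metropolis
    test succeeds; returns the post-sweep list of energies."""
    energies = list(energies)
    for i in range(len(betas) - 1):
        if rng() < swap_accept_prob(betas[i], betas[i + 1],
                                    energies[i], energies[i + 1]):
            energies[i], energies[i + 1] = energies[i + 1], energies[i]
    return energies
```

In this standard rule a swap is always accepted when the colder replica (larger beta) currently holds the higher energy, since the exchange then lowers the joint Boltzmann weight's exponent; the temperatures in this swap test are exactly the quantities the abstract proposes to either fix or optimize.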
* This work is supported in part by the National Science Foundation under Grant No. 2037755. SNL is managed and operated by NTESS under DOE NNSA contract DE-NA0003525.
Presenters
-
Conor Smith
University of New Mexico
Authors
-
Conor Smith
University of New Mexico
-
Tameem Albash
University of New Mexico
-
Quinn Campbell
Sandia National Laboratories
-
Andrew D. Baczewski
Sandia National Laboratories