Improved Training of Quantum Boltzmann Machines
ORAL
Abstract
Quantum Boltzmann machines (QBMs) are a natural quantum generalization of restricted Boltzmann machines (RBMs) that, at least in numerical simulations, outperform their classical counterparts in learning generic data distributions. However, training QBMs with gradient-based methods requires sampling observables in quantum thermal distributions, a problem that is generically NP-hard. In this work, we find that the locality of the gradient observables that must be sampled gives rise to an efficient sampling method based on the Eigenstate Thermalization Hypothesis (ETH), and thus an efficient method for training QBMs on quantum devices. Furthermore, we demonstrate a hybrid gradient-based/black-box optimization procedure that outperforms strictly gradient-based training methods.
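To illustrate the hybrid gradient-based/black-box idea mentioned above, here is a minimal, hypothetical sketch in plain Python: gradient steps on a toy stand-in loss are interleaved with a simple black-box random-perturbation search. The toy loss, its gradient, and all hyperparameters are illustrative assumptions, not the authors' actual QBM objective, in which gradients would be estimated from sampled quantum thermal observables.

```python
import random

# Hypothetical toy loss standing in for the QBM training objective;
# the real objective requires sampling quantum thermal expectation values.
def loss(theta):
    return sum((t - 1.0) ** 2 for t in theta)

def grad(theta):
    # Analytic gradient of the toy loss; in a QBM this would instead be
    # estimated from sampled local gradient observables.
    return [2.0 * (t - 1.0) for t in theta]

def hybrid_train(theta, steps=200, lr=0.1, trials=5, sigma=0.05, seed=0):
    """Alternate gradient descent with a black-box random-search phase."""
    rng = random.Random(seed)
    for _ in range(steps):
        # Gradient phase: one descent step.
        g = grad(theta)
        theta = [t - lr * gi for t, gi in zip(theta, g)]
        # Black-box phase: accept a random perturbation only if it improves.
        best, best_loss = theta, loss(theta)
        for _ in range(trials):
            cand = [t + rng.gauss(0.0, sigma) for t in theta]
            cl = loss(cand)
            if cl < best_loss:
                best, best_loss = cand, cl
        theta = best
    return theta
```

The black-box phase only ever accepts improving candidates, so the loss is non-increasing across that phase; this lets the search escape regions where noisy gradient estimates stall, which is the motivation for combining the two optimizers.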
Presenters
- Eric Anschuetz, Zapata Computing
Authors
- Eric Anschuetz, Zapata Computing
- Yudong Cao, Zapata Computing, Inc.