Improving training of Boltzmann machines with error-corrected quantum annealing

ORAL

Abstract

Boltzmann machines, a class of machine learning models, are the basis of several deep learning methods that have been successfully applied to both supervised and unsupervised machine learning tasks. Quantum annealing may enable future advances in the development of these learning algorithms, but its usefulness is determined in part by the effective temperature at which the annealer samples. We applied nested quantum annealing correction (NQAC) to perform unsupervised learning on a small bars-and-stripes (BAS) dataset and supervised learning on a coarse-grained MNIST dataset consisting of black-and-white images of handwritten digits. For both datasets, we demonstrate an effective-temperature reduction with NQAC that leads to an increase in learning performance. We also find better overall performance at longer annealing times and offer an interpretation of the results based on comparison with simulated quantum annealing (SQA).
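
As background for the effective-temperature claim, a standard modeling assumption (the symbols below are illustrative and are not taken from the abstract itself) is that the annealer returns spin configurations s drawn approximately from a classical Boltzmann distribution over the programmed Hamiltonian H(s) at an effective inverse temperature β_eff, and that these samples are used to estimate the model expectation in the Boltzmann machine's log-likelihood gradient:

  % Sampling assumption: annealer output ~ Boltzmann distribution at beta_eff
  \[
    P(s) \approx \frac{e^{-\beta_{\mathrm{eff}} H(s)}}{Z},
    \qquad Z = \sum_{s'} e^{-\beta_{\mathrm{eff}} H(s')} .
  \]

  % Log-likelihood gradient with respect to a coupling w_{ij}:
  % the second (model) expectation is the term estimated from annealer samples.
  \[
    \frac{\partial \log L}{\partial w_{ij}}
    = \langle s_i s_j \rangle_{\mathrm{data}} - \langle s_i s_j \rangle_{\mathrm{model}} .
  \]

Under this picture, increasing β_eff (i.e., lowering the effective temperature), as NQAC aims to do, brings the sampled distribution closer to the one assumed by the gradient estimator, consistent with the reported improvement in learning performance.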

Presenters

  • Richard Li

    University of Southern California

Authors

  • Richard Li

    University of Southern California

  • Daniel A Lidar

    University of Southern California