Continuous Variable Quantum Boltzmann Machine
ORAL
Abstract
Boltzmann Machines (BMs) are machine learning models that offer a powerful framework for modeling probability distributions. In a BM, the probability distribution of the data is approximated from a finite set of samples. After successful training, the learned distribution resembles the true data distribution closely enough to make correct predictions about unseen instances. However, generalization suffers as the number of parameters grows, and training a classical BM can become impractical. For these reasons, quantum Boltzmann machines (QBMs) have been proposed. The QBM models developed so far, however, use the discrete-variable quantum computing model (based on qubits), a framework only partially suited to continuous-valued data. It is more natural to extend the QBM to the continuous-variable quantum computing (CVQC) model to study continuous data. In this work, we propose a CV QBM that uses a previously developed CV quantum neural network to define the quantum circuit for Gibbs state preparation and the parameters to be trained. We also discuss how to apply this model to classification and synthetic data generation problems.
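As background for the abstract's claim that a BM approximates the data distribution and becomes impractical as parameters grow, the sketch below shows the classical starting point: a Boltzmann machine assigns each binary state an energy and models data with the Gibbs distribution, which requires summing over all 2^n states. This is an illustrative sketch only (the names `W`, `b`, and the helper are our own, not from the talk); the CV QBM described above replaces this discrete state space with continuous quantum modes.

```python
import itertools
import numpy as np

def boltzmann_distribution(W, b):
    """Exact Gibbs distribution p(v) ~ exp(-E(v)) over all binary states.

    Energy: E(v) = -0.5 * v.T @ W @ v - b @ v.
    Enumerating all 2^n states is tractable only for small n, which is
    the scaling problem that motivates quantum Boltzmann machines.
    """
    n = len(b)
    states = np.array(list(itertools.product([0, 1], repeat=n)))
    energies = np.array([-0.5 * v @ W @ v - b @ v for v in states])
    p = np.exp(-energies)
    return states, p / p.sum()

rng = np.random.default_rng(0)
n = 4
W = rng.normal(size=(n, n))
W = (W + W.T) / 2              # symmetric couplings
np.fill_diagonal(W, 0.0)       # no self-interaction
b = rng.normal(size=n)
states, p = boltzmann_distribution(W, b)
```

Note that the partition function (the normalizing sum) costs O(2^n) here; training a quantum BM instead prepares the Gibbs state on hardware rather than enumerating it.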
* This material is based upon work supported by the U.S. Department of Energy, Office of Science, under Award Number DE-SC002432, and by the National Science Foundation under Award DGE-2152168. ©2023 The MITRE Corporation. ALL RIGHTS RESERVED. Approved for public release. Distribution unlimited 21-03848-6.
Presenters
-
Kubra Yeter Aydeniz
Mitre Corp
Authors
-
Kubra Yeter Aydeniz
Mitre Corp
-
George Siopsis
University of Tennessee
-
Shikha Bangar
University of Tennessee
-
Leanto Sunny
University of Tennessee, Knoxville