Deep Learning using Quantum Stochastic Switches

ORAL

Abstract

Machine learning algorithms have demonstrated a significant and growing technological impact. A learning machine, in contrast to a machine learning algorithm, is an analogue physical system that learns. The study of physical learning machines may provide insight into the dynamics and ultimate constraints of learning, while also suggesting pathways to energy-efficient implementations of learning. In a machine learning algorithm the cost function is set by the programmer, whereas in a physical learning machine there is a thermodynamic cost; ultimately this must constrain the learning rate. We consider learning in neural networks whose neurons are implemented as physical stochastic switches. The dynamics of the neuron weights and biases are described in continuous time by stochastic differential equations. Learning (that is, weight and bias update) is implemented by feeding back the labelled training data and the time-averaged switch output onto the drift term of the weight and bias dynamics. First, the dynamics of learning for single-input logic gates are studied numerically and analytically. The effects of estimating the time-averaged switch output from noisy observations using a causal Wiener filter are then considered. Finally, the dynamics of learning for two-input logic gates using a neural network with a hidden layer are studied, as a step toward deep learning in physical neural networks built from stochastic switches.
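The abstract does not give the explicit equations, but the learning scheme it describes — labelled data and time-averaged switch output fed back onto the drift term of continuous-time weight/bias dynamics — can be illustrated with a minimal Euler-Maruyama sketch. Everything here is an assumption for illustration: a logistic form for the time-averaged switch output, a NOT-gate training set, and the rate `eta` and noise strength `sigma` are hypothetical parameters, not taken from the work itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def switch_mean(w, b, x):
    """Assumed steady-state time-averaged output of a stochastic switch
    (logistic dependence on the weighted input; an illustrative choice)."""
    return 1.0 / (1.0 + np.exp(-(w * x + b)))

# Hypothetical single-input NOT-gate training data: target y = 1 - x.
xs = np.array([0.0, 1.0])
ys = np.array([1.0, 0.0])

w, b = 0.0, 0.0          # weight and bias of the single neuron
eta, sigma, dt = 2.0, 0.05, 0.01  # feedback gain, noise strength, time step

for step in range(20000):
    x, y = xs[step % 2], ys[step % 2]
    p = switch_mean(w, b, x)          # time-averaged switch output
    err = y - p                       # mismatch with the labelled datum
    # Drift term driven by the labelled data and the averaged switch
    # output, plus weak diffusion (Euler-Maruyama step of an SDE).
    w += eta * err * x * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    b += eta * err * dt + sigma * np.sqrt(dt) * rng.standard_normal()

print(switch_mean(w, b, 0.0), switch_mean(w, b, 1.0))
```

Under these assumptions the drift pushes the time-averaged output toward the NOT-gate targets (near 1 for input 0, near 0 for input 1), while the diffusion term keeps the weights fluctuating — a toy stand-in for the thermodynamic noise floor the abstract refers to.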

* MJW and GJM acknowledge support from the ARC Centre of Excellence for Engineered Quantum Systems (CE170100009).

Presenters

  • Matthew J Woolley

    University of New South Wales

Authors

  • Matthew J Woolley

    University of New South Wales

  • Ethan P Sigler

    University of New South Wales

  • Gerard J Milburn

    University of Queensland