The Restricted Kirchhoff Machine: self-learning electrical circuits for unsupervised learning

ORAL

Abstract

Recent self-learning electrical circuits have demonstrated that supervised learning can be achieved in analog circuits without a processor or external memory, using only local learning rules. However, much of the learning in the brain is unsupervised.

Here we introduce the Restricted Kirchhoff Machine (RKM), an electrical self-learning circuit with variable resistors that employs local rules to infer patterns from data.

RKMs learn by modifying the curvature of the power landscape, encoding the patterns into the constrained ground states of the circuit.
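For context, the display below sketches the standard power function of a linear resistor network, whose node voltages satisfy Kirchhoff's current law precisely when they minimize the dissipated power; the edge conductances k_ij set the curvature (Hessian) of this landscape and are, presumably, the quantities adjusted by the local learning rule. This is a sketch of the underlying circuit physics only, not the authors' specific update rule.

    % Dissipated power of a linear resistor network with node voltages V_i
    % and edge conductances k_{ij} (the learning degrees of freedom).
    % Stationarity in the free voltages is Kirchhoff's current law.
    \[
      \mathcal{P}(V;k) \;=\; \tfrac{1}{2}\sum_{(i,j)} k_{ij}\,\bigl(V_i - V_j\bigr)^2,
      \qquad
      \frac{\partial \mathcal{P}}{\partial V_i} = 0
      \;\Longleftrightarrow\;
      \sum_{j} k_{ij}\,\bigl(V_i - V_j\bigr) = 0 .
    \]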

Unlike traditional Restricted Boltzmann Machines, physically realized RKMs should equilibrate very quickly and scale to large sizes as neuromorphic learning chips.

To demonstrate their effectiveness, we simulate their training on standard datasets and show that RKMs achieve errors comparable to those of traditional artificial neural networks.

Finally, we discuss experimental considerations and provide back-of-the-envelope estimates of the time and energy consumption of these machines.
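As an illustration of how such estimates are typically structured, here is a minimal back-of-the-envelope sketch in Python; the component values (R, C, V, number of edges) are placeholder assumptions for illustration, not the figures reported in this work.

    # Illustrative estimate of RKM equilibration time and energy per relaxation.
    # All component values below are assumptions, not the authors' numbers.

    R = 1e4        # edge resistance, ohms (assumed ~10 kOhm)
    C = 1e-12      # parasitic node capacitance, farads (assumed ~1 pF)
    V = 1.0        # typical voltage drop across an edge, volts (assumed)
    n_edges = 1e4  # number of edges in the network (assumed)

    tau = R * C                    # RC time scale for relaxing to the ground state
    power = n_edges * V**2 / R     # power dissipated while inputs are clamped
    energy = power * tau           # energy spent per relaxation

    print(f"equilibration time ~ {tau:.1e} s")     # ~1e-8 s
    print(f"dissipated power   ~ {power:.1e} W")   # ~1 W
    print(f"energy/relaxation  ~ {energy:.1e} J")  # ~1e-8 J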

* This research is supported by the National Science Foundation via grant DMR-2005749 (M.G.) and the Simons Foundation via Investigator grant 327939 (M.G. and A.J.L.)

Presenters

  • Marcelo Guzmán

University of Pennsylvania

Authors

  • Marcelo Guzmán

University of Pennsylvania

  • Simone Ciarella

    Netherlands eScience Center

  • Andrea J Liu

    University of Pennsylvania