Spatial Signatures of Learning in Self-Learning Networks

ORAL

Abstract

Recent works have demonstrated that physical systems can be trained for functionality and have developed general frameworks for learning in physical systems such as electrical, flow, and mechanical networks. A great appeal of physical learning systems is their promise of greater interpretability. In such systems, learning and adaptation occur under physical constraints; this affects both the learning and physical Hessians. Learning encodes functionality in the eigensystem of the system’s physical Hessian, softening the low eigenmodes and aligning them with the desired response [1]. As in deep neural networks, learning raises the high eigenmodes of the learning Hessian [1]; these eigenmodes contain the same information as the low eigenmodes of the physical Hessian. We build on [1,2] and use the low eigenmodes of the physical Hessian to build an understanding of how self-learning electrical circuits learn various machine-learning tasks.
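To make the notion of the physical Hessian and its low eigenmodes concrete, the sketch below (ours, not the authors' code) builds the Hessian of the dissipated power for a small linear resistor network, where it reduces to the conductance-weighted graph Laplacian, and compares its softest eigenmodes in an untrained and a toy "trained" state. The topology, conductance values, and the trained state are illustrative assumptions only.

```python
# Minimal sketch, not the authors' implementation: eigenmodes of the physical
# Hessian of a linear resistor network. For node voltages v, the dissipated
# power is P(v) = 1/2 v^T H v with H = B^T diag(k) B (the weighted graph
# Laplacian), so H plays the role of the physical Hessian. The network,
# conductances, and the "trained" state are illustrative assumptions.
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # toy example circuit
n_nodes = 4

def incidence(edges, n_nodes):
    B = np.zeros((len(edges), n_nodes))
    for row, (i, j) in enumerate(edges):
        B[row, i], B[row, j] = 1.0, -1.0
    return B

def physical_hessian(conductances, edges, n_nodes):
    B = incidence(edges, n_nodes)
    return B.T @ np.diag(conductances) @ B

k_untrained = np.ones(len(edges))                # uniform conductances
k_trained = np.array([1.0, 0.1, 1.0, 0.1, 1.0])  # toy "learned" conductances

for label, k in [("untrained", k_untrained), ("trained", k_trained)]:
    H = physical_hessian(k, edges, n_nodes)
    evals, evecs = np.linalg.eigh(H)
    # The lowest eigenvalue is ~0 (global voltage shift); the next-lowest
    # eigenmodes are the "soft" directions that learning reshapes.
    print(label, "low eigenvalues:", np.round(evals[1:3], 3))
    print(label, "softest nontrivial mode:", np.round(evecs[:, 1], 3))
```

In a trained self-learning circuit the analogous analysis would be done on the learned conductances, and the softest nontrivial modes would be compared with the task's desired response.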

[1] Menachem Stern, Andrea J. Liu, Vijay Balasubramanian, The Physical Effects of Learning, arXiv:2306.12928

[2] Jason W. Rocks, Andrea J. Liu, and Eleni Katifori, Hidden Topological Structure of Flow Network Functionality, Phys. Rev. Lett. 126, 028102

* FM acknowledges support from NSF grant #2152205, AL and MG acknowledge support from Simons Foundation Investigator grant #327939, and MG also acknowledges support from NSF-DMR-2005749.

Presenters

  • Felipe Martins

    University of Pennsylvania

Authors

  • Felipe Martins

    University of Pennsylvania

  • Marcelo Guzmán

    University of Pennsylvania

  • Andrea J. Liu

    University of Pennsylvania