Tensor Network Machine Learning Models

Invited

Abstract

Tensor networks are an efficient approach to representing complicated many-body wavefunctions in terms of many smaller tensors, and they lead to powerful algorithms for studying strongly correlated systems. But tensor networks can be applied much more broadly than just representing wavefunctions. Large tensors similar to wavefunctions appear naturally in certain classes of models studied extensively in machine learning. Decomposing the model parameters as a tensor network leads to interesting algorithms for training models on real-world data that scale better than existing approaches. In addition to training models directly to recognize labeled data, tensor network real-space renormalization approaches can be used to extract statistically significant "features" for subsequent learning tasks. I will also highlight other benefits of the tensor network approach, such as the flexibility to blend different approaches and to interpret trained models.
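To make the idea of decomposing model parameters as a tensor network concrete, the following is a minimal sketch (not from the talk) of a matrix product state (MPS) classifier in the spirit of this line of work: each pixel is mapped to a small local feature vector, the product of these vectors is contracted with a chain of parameter tensors, and an extra index on the final tensor produces one score per class. The function names, the (cos, sin) feature map, and the random initialization are illustrative assumptions, not the presenter's implementation.

```python
import numpy as np

def feature_map(pixels):
    """Illustrative local encoding: map each pixel in [0, 1] to a
    2-component (cos, sin) feature vector."""
    theta = 0.5 * np.pi * np.asarray(pixels)
    return np.stack([np.cos(theta), np.sin(theta)], axis=-1)       # shape (N, 2)

def random_mps_classifier(n_sites, n_classes, bond_dim=10, seed=0):
    """Untrained model parameters: one (left_bond, physical=2, right_bond)
    tensor per pixel, plus a class index attached at the end of the chain."""
    rng = np.random.default_rng(seed)
    tensors = []
    for i in range(n_sites):
        dl = 1 if i == 0 else bond_dim
        dr = 1 if i == n_sites - 1 else bond_dim
        tensors.append(rng.normal(scale=0.1, size=(dl, 2, dr)))
    label_tensor = rng.normal(scale=0.1, size=(1, n_classes))      # closes the chain
    return tensors, label_tensor

def classify(tensors, label_tensor, pixels):
    """Contract the feature vectors into the MPS from left to right,
    returning one score per class."""
    phi = feature_map(pixels)
    v = np.ones(1)                                   # open boundary, bond dimension 1
    for A, f in zip(tensors, phi):
        M = np.einsum('p,lpr->lr', f, A)             # absorb the local feature vector
        v = v @ M                                    # carry the contraction along the chain
    return v @ label_tensor                          # shape (n_classes,)

if __name__ == "__main__":
    n_pixels, n_classes = 16, 10                     # e.g. a flattened 4x4 image patch
    tensors, label_tensor = random_mps_classifier(n_pixels, n_classes)
    image = np.random.rand(n_pixels)
    print(classify(tensors, label_tensor, image))    # class scores from the untrained model
```

Because the contraction sweeps the chain one site at a time, the cost is linear in the number of pixels for fixed bond dimension, which is the scaling advantage the abstract alludes to; training would then optimize the `tensors` (for example by sweeping algorithms analogous to DMRG), which the sketch does not attempt.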

Presenters

  • Edwin Stoudenmire

    Center for Computational Quantum Physics, Flatiron Institute; Department of Physics and Astronomy, University of California, Irvine

Authors

  • Edwin Stoudenmire

    Center for Computational Quantum Physics, Flatiron Institute; Department of Physics and Astronomy, University of California, Irvine