Transformer Quantum States: Principles, Sampling, and Constraints

POSTER

Abstract

Neural quantum states (NQS) approximate many-body wavefunctions by optimizing a parametrized ansatz with variational Monte Carlo. Among NQS, transformer architectures are particularly promising because they capture long-range dependencies and generalize well beyond their initial training curricula. We provide an overview of the machine learning principles and sub-problems that arise when applying transformer architectures to problems in condensed matter theory. These include a framework for representing computation graphs through which randomly sampled variables flow, notes on token-sampling routines and potential sampling obstacles, encoding schemes that translate intermediate transformer outputs into representations of sites with non-trivial physical constraints, and practices for structuring neural networks so that they surface diagnostics about the optimization process.
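The token-sampling routines mentioned above can be illustrated with a minimal sketch of autoregressive sampling, the scheme commonly used with transformer quantum states. Here the transformer is replaced by a hypothetical toy conditional (`conditional_prob_up`), invented purely for illustration; the point is only the structure of the sampler, in which the joint probability over sites factorizes into a product of conditionals, so each configuration is drawn exactly without Markov-chain burn-in.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 6  # number of lattice sites, each a spin in {0, 1}

def conditional_prob_up(prefix):
    """Hypothetical stand-in for a transformer: probability that the
    next spin is 'up' (1), conditioned on the spins sampled so far."""
    if not prefix:
        return 0.5
    # Toy rule biased toward alternating spins.
    return 0.8 if prefix[-1] == 0 else 0.2

def sample_configuration():
    """Draw one configuration site by site (autoregressive sampling),
    accumulating the log-probability of the sampled configuration."""
    spins, log_prob = [], 0.0
    for _ in range(N):
        p_up = conditional_prob_up(spins)
        s = int(rng.random() < p_up)  # Bernoulli draw for this site
        spins.append(s)
        log_prob += np.log(p_up if s == 1 else 1.0 - p_up)
    return spins, log_prob

config, logp = sample_configuration()
print(config, logp)
```

In a real transformer quantum state, the conditional at each site is produced by a forward pass over the previously sampled tokens, and the accumulated log-probability enters the variational Monte Carlo estimator.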

Presenters

  • Spandan J Suthar

    California Polytechnic State University, San Luis Obispo

Authors

  • Spandan J Suthar

    California Polytechnic State University, San Luis Obispo