Normalizing flows for uncertainty quantification in a Bayesian framework

ORAL · Invited

Abstract

Uncertainty quantification of theoretical models results in a posterior distribution of credible model parameters, and acquiring accurate statistics from such distributions is an essential building block for model predictions of observations in future experiments. Posterior distributions can be continuously updated when new experimental data or theoretical model improvements become available, requiring a new sampling of the model parameters at each update. Standard Monte Carlo methods, however, are typically inefficient for the high-dimensional distributions encountered in many theoretical models. Normalizing flows serve as a powerful tool to accelerate the sampling from these frequently changing posterior distributions. A normalizing flow is a map that induces a non-trivial distribution, such as a posterior distribution, from a Gaussian distribution of the same dimension. Once such a normalizing flow is obtained, sampling of the model parameters can be done from the Gaussian distribution, followed by application of the normalizing flow. This sampling process can be done in parallel without any autocorrelation between samples, thus achieving a dramatic speed-up compared to standard Monte Carlo methods. In this talk, we will discuss a method for training and updating normalizing flows for posterior distributions via machine learning and demonstrate the method within covariant density functional theory.
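The sampling process described above can be sketched as follows. This is a minimal, illustrative example, not the method of the talk: the flow here is a fixed affine map (the simplest invertible transformation), standing in for the trained neural-network flow; the parameters `L` and `mu` are hypothetical placeholders for learned quantities.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 3

# Hypothetical "trained" flow parameters: a lower-triangular scale L and a
# shift mu, so T(z) = mu + L @ z pushes N(0, I) onto N(mu, L @ L.T).
# A real flow would be a learned invertible network targeting the posterior.
L = np.array([[1.0, 0.0, 0.0],
              [0.5, 2.0, 0.0],
              [0.2, 0.1, 0.5]])
mu = np.array([1.0, -2.0, 0.5])

def flow(z):
    """Apply the (affine) normalizing flow to Gaussian draws z of shape (n, dim)."""
    return mu + z @ L.T

# Draw independent standard Gaussians, then apply the flow. Every row is an
# independent sample from the target distribution, so there is no
# autocorrelation, and the whole batch is produced in one vectorized
# (parallelizable) operation -- the speed-up over Markov chain methods.
z = rng.standard_normal((10_000, dim))
samples = flow(z)

print(samples.mean(axis=0))  # close to mu
print(np.cov(samples.T))     # close to L @ L.T
```

Because each sample is generated independently from a Gaussian draw, the batch size can be scaled up freely, which is what makes re-sampling cheap each time the posterior is updated.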

Presenters

  • Yukari Yamauchi

    University of Washington, Institute for Nuclear Theory

Authors

  • Yukari Yamauchi

    University of Washington, Institute for Nuclear Theory

  • Kyle S Godbey

    Michigan State University

  • Pablo G Giuliani

    Facility for Rare Isotope Beams

  • Landon Buskirk

    Michigan State University, Facility for Rare Isotope Beams