Efficient Sampling of Equilibrium Distributions with Energy-Informed, Scalable Diffusion Models
ORAL
Abstract
Many problems in statistical mechanics involve inference in high-dimensional spaces, where sampling techniques such as Markov Chain Monte Carlo (MCMC) frequently suffer from slow decorrelation and computational bottlenecks. To enhance sampling efficiency, MCMC can be augmented with deep generative models, a class of machine learning models that can learn and generate samples from desired probability distributions. We demonstrate the potential of integrating force field information into a specific type of deep generative model known as score-based diffusion models, with practical applications to the generation of Lennard-Jones liquid configurations. This approach also gives rise to the possibility of conditional generation, allowing particles to be generated based on their chemical environment. We aim to achieve scalability and generalizability in our sampling methods through the use of physics-informed generative models.
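The core idea of integrating force field information into a score-based model can be illustrated with a small sketch. For a Boltzmann distribution p(x) ∝ exp(−U(x)/kT), the score ∇ log p(x) equals the force divided by kT, so forces computed from the potential can serve as score targets at low noise levels. The sketch below (illustrative only; function names and the minimal-image-free pair loop are our assumptions, not the authors' implementation) computes Lennard-Jones energies and forces and exposes them as a force-informed score target:

```python
# Illustrative sketch (assumed implementation, not the authors' code):
# Lennard-Jones forces used as score targets for a diffusion model.
import numpy as np

def lj_potential_and_forces(x, eps=1.0, sigma=1.0):
    """Lennard-Jones energy and forces for positions x of shape (N, d).

    U(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6); no periodic boundaries
    or cutoff are applied in this toy version.
    """
    n = x.shape[0]
    energy = 0.0
    forces = np.zeros_like(x)
    for i in range(n):
        for j in range(i + 1, n):
            rij = x[i] - x[j]
            r = np.linalg.norm(rij)
            sr6 = (sigma / r) ** 6
            energy += 4.0 * eps * (sr6 ** 2 - sr6)
            # F_i = -dU/dr * (rij / r); dU/dr = (24*eps/r)*(sr6 - 2*sr6**2)
            fmag = 24.0 * eps * (2.0 * sr6 ** 2 - sr6) / r
            forces[i] += fmag * rij / r
            forces[j] -= fmag * rij / r
    return energy, forces

def force_informed_score_target(x, kT=1.0):
    """Score of the Boltzmann density: grad log p(x) = F(x) / kT.

    In a diffusion model this can supply the low-noise score target,
    replacing or regularizing the usual denoising score-matching target.
    """
    _, forces = lj_potential_and_forces(x)
    return forces / kT
```

As a sanity check, two particles separated by the potential minimum r = 2^(1/6) σ have energy −ε and vanishing forces, so the score target there is zero, as expected for a mode of the Boltzmann density.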
Presenters
-
Sherry Li
Stanford University
Authors
-
Sherry Li
Stanford University
-
Grant M Rotskoff
Stanford University