ML Gradients in Molecular Simulations
ORAL · Invited
Abstract
The success of deep learning is predicated on differentiable programming and gradient-based optimization. In scientific applications, merging machine learning models with physics-based simulators is particularly compelling: ML surrogates can replace expensive simulators, while physics-derived concepts and invariances add inductive bias to otherwise black-box models. Here, we will describe research examples of exploiting ML surrogate functions, and in particular their gradients, accessed through differentiable programming in molecular simulations. Applications include active learning of machine learning potentials for ground and excited states with differentiable uncertainty, and learning of data-driven collective variables for enhanced sampling simulations.
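To illustrate the core idea of accessing surrogate gradients through differentiable programming, here is a minimal, hypothetical sketch (not the speaker's actual code) in PyTorch: a toy neural-network potential maps atomic positions to a scalar energy, and automatic differentiation yields the forces as the negative gradient of that energy with respect to the positions.

```python
import torch

torch.manual_seed(0)

# Toy ML surrogate for a potential energy surface: a small MLP that maps
# the flattened coordinates of 3 atoms (9 numbers) to a scalar energy.
model = torch.nn.Sequential(
    torch.nn.Linear(9, 16),
    torch.nn.Tanh(),
    torch.nn.Linear(16, 1),
)

# Positions of 3 atoms in 3D, tracked for differentiation.
positions = torch.randn(3, 3, requires_grad=True)

# Predicted energy from the surrogate.
energy = model(positions.flatten()).squeeze()

# Forces F = -dE/dx, obtained via automatic differentiation rather than
# a hand-derived analytic gradient.
forces = -torch.autograd.grad(energy, positions)[0]

print(forces.shape)  # one 3D force vector per atom
```

The same mechanism generalizes: any differentiable quantity built from the surrogate (an uncertainty estimate, a learned collective variable) can be differentiated with respect to atomic coordinates and used to drive a simulation.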
Presenters
- Rafael Gomez-Bombarelli (MIT)
Authors
- Rafael Gomez-Bombarelli (MIT)