Efficient Training of Neural-Network Interatomic Potentials with Atomic Forces
ORAL
Abstract
Neural-network interatomic potentials have emerged as a promising method to extend the accessible time and length scales of molecular dynamics simulations of condensed systems. In the learning process, models are usually trained on a set of reference energies from electronic-structure calculations. In addition to total energies, local atomic forces are often also available via the Hellmann–Feynman theorem. Including atomic forces of quantum-mechanical accuracy in the training process can greatly enhance the fidelity of the learned potential energy surface, since forces provide a wealth of additional information about its shape. However, this training scheme often comes at a much greater computational expense. In this talk, we address the speed and fidelity of integrating forces into the training process and examine strategies for rapid training of neural-network potentials for complex systems involving large training sets.
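The combined energy-and-force training described in the abstract can be sketched with a toy example. The snippet below fits a one-dimensional harmonic potential to reference energies and Hellmann–Feynman-style forces by minimizing a weighted sum of energy and force residuals. The harmonic model, the parameter names, and the force weight `lam` are illustrative assumptions, not the authors' actual architecture or loss.

```python
import numpy as np

# Reference data from a "ground-truth" potential E(x) = 0.5*k*(x - x0)^2
k_true, x0_true = 2.0, 0.5
xs = np.linspace(-1.0, 2.0, 20)
E_ref = 0.5 * k_true * (xs - x0_true) ** 2
F_ref = -k_true * (xs - x0_true)  # forces are the negative energy gradient

def loss_and_grads(k, x0, lam=1.0):
    """Combined energy+force loss and its analytic gradients w.r.t. (k, x0).

    lam weights the force term relative to the energy term (an
    illustrative hyperparameter, not a value from the talk).
    """
    dE = 0.5 * k * (xs - x0) ** 2 - E_ref   # energy residuals
    dF = -k * (xs - x0) - F_ref             # force residuals
    loss = np.mean(dE ** 2) + lam * np.mean(dF ** 2)
    # Chain rule through both residual terms:
    g_k = np.mean(2 * dE * 0.5 * (xs - x0) ** 2) + lam * np.mean(2 * dF * -(xs - x0))
    g_x0 = np.mean(2 * dE * -k * (xs - x0)) + lam * np.mean(2 * dF * k)
    return loss, g_k, g_x0

# Plain gradient descent from a deliberately wrong initial guess
k, x0 = 1.0, 0.0
for _ in range(2000):
    _, g_k, g_x0 = loss_and_grads(k, x0)
    k -= 0.05 * g_k
    x0 -= 0.05 * g_x0
```

Because the force residuals constrain the slope of the potential at every sampled point, adding them to the loss gives the optimizer far more information per configuration than energies alone, which is the fidelity benefit the abstract refers to; the added cost in real models comes from differentiating the network output with respect to all atomic positions at every training step.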
Presenters
-
Simon Batzner
John A. Paulson School of Engineering and Applied Sciences, Harvard University
Authors
-
Simon Batzner
John A. Paulson School of Engineering and Applied Sciences, Harvard University
-
Boris Kozinsky
John A. Paulson School of Engineering and Applied Sciences, Harvard University