Parametrically managed activation function for fitting a neural network potential with physical behavior enforced by lower-level density functional theory calculations

ORAL

Abstract

Machine-learned potential energy surfaces (PESs) are unreliable in regions with little or no training data. The goal of this work is to combine the efficiency of machine learning with human intelligence for regions where we do not need accurate data, e.g., large subsystem separations. We present a way to add a parametrically managed activation function (PMAF) to a neural network (NN) potential to ensure correct behavior at large and small distances. The NN that uses the PMAF is called a parametrically managed NN (PMNN). The method involves two levels of electronic structure theory, a higher level (HL) and a lower level (LL). For the present tests, these are an accurate density functional method (CF22D/maug-cc-pVTZ) and an inexpensive density functional method (MPW1K/MIDIY). The goal is to reach HL accuracy for all geometries without making HL calculations in regions where the LL can guide the fit. Here, we consider the PES for dissociation of the S–H bond of o-fluorothiophenol in the ground electronic state. We write the final potential as EPMNN = ΔEPMNN + ELL, where ΔEPMNN is the NN fit to the difference (EHL − ELL). The propagation from the input layer to the fourth hidden layer is done by the GELU activation function, followed by the PMAF at the sixth layer. The PMAF damps ΔEPMNN to zero in asymptotic regions (where an atom dissociates) whenever the maximum of the 78 interatomic distances exceeds a cutoff distance. For small interatomic distances, the PMAF damps ΔEPMNN to zero whenever ELL is greater than a cutoff energy.
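The two damping conditions above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the abstract specifies only the triggers (maximum interatomic distance exceeding a cutoff, and the LL energy exceeding a cutoff), so the Gaussian switching forms, the function names, and the parameters `alpha`, `beta`, `r_cut`, and `e_cut` are all assumptions made for this sketch.

```python
import numpy as np

def pmaf_damp(delta_e, distances, e_ll, r_cut, e_cut, alpha=2.0, beta=2.0):
    """Damp the NN correction (Delta-E_PMNN) toward zero outside the trusted region.

    The Gaussian switching forms are illustrative assumptions; only the two
    triggers (large max distance, large LL energy) come from the abstract.
    """
    r_max = np.max(distances)  # maximum of the interatomic distances
    # Large-separation damping: factor goes smoothly to zero once r_max > r_cut.
    f_dist = 1.0 if r_max <= r_cut else np.exp(-alpha * (r_max - r_cut) ** 2)
    # Short-distance damping: factor goes to zero once the LL energy exceeds e_cut.
    f_energy = 1.0 if e_ll <= e_cut else np.exp(-beta * (e_ll - e_cut) ** 2)
    return delta_e * f_dist * f_energy

def e_pmnn(delta_e_nn, e_ll, distances, r_cut=6.0, e_cut=2.0):
    """Final potential: E_PMNN = (damped Delta-E_PMNN) + E_LL."""
    return pmaf_damp(delta_e_nn, distances, e_ll, r_cut, e_cut) + e_ll
```

With this construction, E_PMNN reduces to ELL wherever either damping factor is zero, so the physical limits of the potential are controlled by the LL calculation rather than by NN extrapolation.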

* U.S. Department of Energy, Office of Basic Energy Sciences, Award DE-SC0015997

Presenters

  • Suman Bhaumik

    University of Minnesota

Authors

  • Suman Bhaumik

    University of Minnesota

  • Yinan Shu

    University of Minnesota

  • Donald G. Truhlar

    University of Minnesota