Extensive deep neural networks for 2D materials
ORAL
Abstract
We present a procedure for training and evaluating a deep neural network that can efficiently infer extensive parameters of arbitrarily large systems with O(N) complexity. We use a form of domain decomposition for training and inference, in which each sub-domain (tile) comprises a non-overlapping focus region surrounded by an overlapping context region. The relative sizes of the focus and context regions are physically motivated and depend on the locality length scale of the problem. Extensive deep neural networks (EDNNs) are a formulation of convolutional neural networks that provides a flexible, general, physically constrained approach to describing multi-scale interactions. They are well suited to massively parallel inference, since no inter-thread communication is necessary during evaluation. Example applications to graphene, hexagonal boron nitride (hBN), and their 2D alloys are demonstrated.
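The focus/context decomposition can be illustrated with a short sketch. The Python snippet below is a minimal illustration under stated assumptions, not the authors' implementation: the names extract_tiles and tile_network, and the particular focus and context sizes, are hypothetical choices for the example, and the trained CNN is replaced by a stand-in that simply sums each focus region.

    import numpy as np

    def extract_tiles(grid, focus, context):
        """Split a periodic 2D grid into overlapping tiles.

        Each tile is a (focus + 2*context)-wide square: a non-overlapping
        focus region plus its surrounding context, taken with periodic
        (wrap-around) boundary conditions.
        """
        n = grid.shape[0]
        assert n % focus == 0, "grid size must be a multiple of the focus size"
        padded = np.pad(grid, context, mode="wrap")  # periodic boundaries
        width = focus + 2 * context
        return [padded[i:i + width, j:j + width]
                for i in range(0, n, focus)
                for j in range(0, n, focus)]

    def tile_network(tile, context):
        """Stand-in for the trained CNN: one scalar contribution per tile.

        In an EDNN this would be a convolutional network whose per-tile
        outputs, summed over all tiles, are trained to reproduce the
        extensive target. Here we simply sum the focus region, which
        makes the decomposition exactly extensive by construction.
        """
        return tile[context:-context, context:-context].sum()

    # Toy usage: a 32x32 periodic "density" grid, 8x8 focus, 4-cell context.
    rng = np.random.default_rng(0)
    grid = rng.random((32, 32))
    tiles = extract_tiles(grid, focus=8, context=4)

    # Each tile is evaluated independently (no inter-thread communication),
    # so the total cost grows as O(N) in the number of tiles.
    total = sum(tile_network(t, context=4) for t in tiles)
    assert np.isclose(total, grid.sum())  # extensivity check for the stand-in

Because each tile is processed independently, the per-tile evaluations can be distributed across threads or devices with no communication, which is the property that makes the approach suited to massively parallel inference.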
Presenters
- Isaac Tamblyn, National Research Council of Canada
Authors
- Iryna Luchak, University of British Columbia
- Kyle Mills, University of Ontario Institute of Technology
- Kevin Ryczko, University of Ottawa
- Adam Domurad, University of Waterloo
- Christopher Beeler, University of Ontario Institute of Technology
- Isaac Tamblyn, National Research Council of Canada