Natural Language Processing for Material Properties Prediction

ORAL

Abstract

Transformer-based models have risen in popularity in natural language processing (NLP) due to their ability to contextualize words by capturing correlations between them through the attention mechanism. Recognizing correlations between atoms is likewise crucial to determining material properties, especially when long-range interactions are present; however, many graph-based neural networks built on node-edge representations rely on either a large cutoff radius or an increased number of layers to accommodate long-distance correlations. In this talk, we introduce a transformer-based neural network model for representing crystalline materials. We explore its capacity for capturing long-range interactions through unsupervised pre-training, and we further examine the performance of the pre-trained model on several downstream property prediction tasks. Our work paves the way toward more computationally efficient architectures for material property prediction.
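
To make the contrast concrete: in a graph network, each layer only exchanges information between atoms within a cutoff radius, whereas a single attention layer lets every atom attend to every other atom regardless of distance. The following Python sketch illustrates single-head self-attention over atom embeddings in that spirit; it is not the authors' model, and all names, dimensions, and the random input are hypothetical placeholders.

import torch
import torch.nn as nn

class AtomSelfAttention(nn.Module):
    """Single-head self-attention over a set of atom feature vectors.

    Every atom attends to every other atom, so pairwise correlations are
    captured in one layer without a distance cutoff. (Illustrative sketch
    only, not the model presented in this talk.)
    """
    def __init__(self, d_model: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_atoms, d_model) embeddings of atoms in a structure
        Q, K, V = self.q(x), self.k(x), self.v(x)
        scores = Q @ K.T / (x.shape[-1] ** 0.5)  # (num_atoms, num_atoms)
        weights = torch.softmax(scores, dim=-1)  # attention over all atoms
        return weights @ V                       # contextualized embeddings

# Hypothetical usage: 8 atoms with 64-dimensional embeddings.
atoms = torch.randn(8, 64)
layer = AtomSelfAttention(64)
out = layer(atoms)   # each row now mixes information from all atoms
print(out.shape)     # torch.Size([8, 64])

Because the attention weights form a dense num_atoms-by-num_atoms matrix, long-range correlations appear in a single layer, which is the efficiency argument the abstract makes against deep stacks of local message-passing layers.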

* Fellowships: Center for Condensed Matter Sciences Undergraduate Fellowship and William Oegerle Scholarship in Physics and Astronomy. This work is supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences under Award No. DE-SC0022216 and by the Laboratory Directed Research and Development program at SLAC National Accelerator Laboratory, under contract DE-AC02-76SF00515.

Presenters

  • Nhat Huy M Tran

    University of Florida

Authors

  • Nhat Huy M Tran

    University of Florida

  • Momei Fang

    University of California San Diego

  • Fangze Liu

    Stanford University

  • Zhantao Chen

    SLAC National Accelerator Laboratory

  • Chunjing Jia

    University of Florida