Physics Informed Contrastive Learning for Homogeneous Partial Differential Equation Surrogate Modeling

ORAL

Abstract

Neural operators have recently grown in popularity as surrogate models for partial differential equations (PDEs). Learning mappings between function spaces, rather than between individual functions, has proven to be a powerful approach for computing fast, accurate solutions to complex PDEs. While much work has been done evaluating neural operator performance on a wide variety of surrogate modeling tasks, these works typically evaluate performance on a single equation at a time. In this work, we develop a novel contrastive pretraining framework, utilizing a Generalized Contrastive Loss, that improves neural operator generalization across multiple governing equations simultaneously. Governing equation coefficients are used to measure ground-truth similarity between systems, which weights the positive and negative samples. A combination of physics-informed system evolution and latent-space model output is anchored to the input data and used in our distance function. We find that physics-informed contrastive pretraining improves both accuracy and generalization for the Fourier Neural Operator on fixed-future and autoregressive rollout tasks for the 1D and 2D heat, Burgers, and linear advection equations.
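To make the coefficient-based weighting concrete, below is a minimal PyTorch sketch of a similarity-weighted contrastive loss in the spirit of the Generalized Contrastive Loss described above. The function names (coefficient_similarity, generalized_contrastive_loss), the Gaussian kernel on coefficient distances, and the temperature value are illustrative assumptions, not the authors' implementation; the physics-informed system-evolution term in the distance function is omitted for brevity.

```python
import torch
import torch.nn.functional as F

def coefficient_similarity(coeffs: torch.Tensor) -> torch.Tensor:
    """Ground-truth similarity between systems from their governing-equation
    coefficient vectors (shape [batch, n_coeffs]). A Gaussian kernel on
    coefficient distance is assumed here; identical systems score 1."""
    dists = torch.cdist(coeffs, coeffs, p=2)   # pairwise L2 distances, [batch, batch]
    return torch.exp(-dists ** 2)

def generalized_contrastive_loss(z: torch.Tensor,
                                 coeffs: torch.Tensor,
                                 temperature: float = 0.1) -> torch.Tensor:
    """Contrastive loss over latent embeddings z ([batch, d]) in which each
    pair is weighted by coefficient similarity, so samples from similar
    systems act as soft positives and dissimilar systems as negatives."""
    z = F.normalize(z, dim=1)
    logits = (z @ z.T) / temperature                        # scaled cosine similarities
    w = coefficient_similarity(coeffs)                      # soft pair weights, [batch, batch]
    off_diag = 1.0 - torch.eye(z.size(0), device=z.device)  # exclude self-pairs
    # Log-softmax over each row, with the self-pair masked out of the denominator.
    masked_logits = logits.masked_fill(off_diag == 0, float("-inf"))
    log_prob = logits - torch.logsumexp(masked_logits, dim=1, keepdim=True)
    # Similarity-weighted average log-probability over non-self pairs.
    weights = w * off_diag
    loss = -(weights * log_prob).sum(dim=1) / weights.sum(dim=1).clamp_min(1e-8)
    return loss.mean()
```

In a pretraining loop, z would come from the neural operator's latent output on a batch of input states drawn from several governing equations, and coeffs would hold each sample's equation coefficients, so that systems with similar physics are pulled together in latent space.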

* This material is based upon work supported by the National Science Foundation under Grant No. 1953222.

Publication: We plan to submit this work to a journal or conference once the experiments are complete.

Presenters

  • Cooper Lorsung

    Carnegie Mellon University

Authors

  • Cooper Lorsung

    Carnegie Mellon University

  • Amir Barati Farimani

    Carnegie Mellon University