Training Machine Learning Emulators to Preserve Invariant Measures of Chaotic Attractors
ORAL · Invited
Abstract
Machine learning emulators are accelerating simulations of complex physical phenomena and helping create more accurate models by learning directly from data. These emulators enable faster predictions and uncertainty quantification, as well as new approaches to parameter estimation and other inverse problems. However, chaotic dynamics make long-horizon forecasts difficult because small perturbations in initial conditions cause trajectories to diverge exponentially. In this setting, emulators trained to minimize squared-error losses, while capable of accurate short-term forecasts, often fail to reproduce statistical or structural properties of the dynamics over longer time horizons and can yield degenerate results. We propose an alternative framework designed to preserve invariant measures of chaotic attractors, which characterize the time-invariant statistical properties of the dynamics. Specifically, in the multi-environment setting (where each sample trajectory is governed by slightly different dynamics), we consider two novel approaches to training with noisy data. First, we propose a loss based on the optimal transport distance between the observed dynamics and the neural operator outputs. This approach requires expert knowledge of the underlying physics to determine which statistical features to include in the optimal transport loss. Second, we show that a contrastive learning framework, which does not require any specialized prior knowledge, preserves statistical properties of the dynamics nearly as well as the optimal transport approach. Across a variety of chaotic systems, we show empirically that both approaches preserve invariant measures of the underlying attractors.
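To make the optimal transport idea concrete, here is a minimal, hypothetical sketch, not the authors' implementation: it computes an expert-chosen summary statistic along each trajectory (a pointwise energy, purely as an example of the physics-informed features mentioned above) and compares the resulting empirical distributions with the exact one-dimensional Wasserstein-1 distance. The statistic, the choice of OT distance, and all function names are illustrative assumptions.

```python
# Hedged sketch: an OT-based training loss that matches the distribution of a
# physics-informed summary statistic, instead of penalizing pointwise error.
# The statistic and the 1-D Wasserstein-1 distance are illustrative choices.
import torch

def summary_statistic(traj: torch.Tensor) -> torch.Tensor:
    """Illustrative statistic: pointwise 'energy' u^2, pooled over batch,
    time, and space into one set of samples.

    traj: (batch, time, space) trajectory tensor.
    """
    return (traj ** 2).reshape(-1)

def wasserstein1_1d(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Exact 1-D Wasserstein-1 distance between equal-size empirical samples:
    sort both samples and average the absolute quantile differences."""
    assert x.numel() == y.numel(), "sketch assumes equal sample sizes"
    xs = x.flatten().sort().values
    ys = y.flatten().sort().values
    return (xs - ys).abs().mean()

def invariant_measure_loss(pred_traj: torch.Tensor,
                           true_traj: torch.Tensor) -> torch.Tensor:
    """Compare distributions of the statistic along emulator rollouts and
    observed trajectories; differentiable, so usable as a training loss."""
    return wasserstein1_1d(summary_statistic(pred_traj),
                           summary_statistic(true_traj))
```

Because this loss compares distributions rather than time-aligned states, two trajectories that have drifted out of phase on the attractor can still incur near-zero loss, which is precisely the tolerance a squared-error loss lacks under chaotic divergence.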
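For the contrastive alternative, one plausible reading is a two-stage recipe: first train an encoder with an InfoNCE-style objective so that windows drawn from the same trajectory (and hence the same environment's dynamics) embed nearby while windows from other trajectories are pushed apart, then train the emulator to match the frozen encoder's features on its rollouts. The sketch below is a hedged illustration under that assumption; the encoder interface, the losses, and all names are assumptions rather than the paper's code.

```python
# Hedged sketch of a contrastive feature loss replacing hand-chosen statistics.
import torch
import torch.nn.functional as F

def info_nce(z_a: torch.Tensor, z_b: torch.Tensor,
             temperature: float = 0.1) -> torch.Tensor:
    """z_a, z_b: (batch, dim) embeddings of two windows per trajectory;
    row i of z_a and row i of z_b are the positive pair, all other rows
    in the batch serve as negatives."""
    z_a = F.normalize(z_a, dim=1)
    z_b = F.normalize(z_b, dim=1)
    logits = z_a @ z_b.t() / temperature                  # (batch, batch) similarities
    targets = torch.arange(z_a.size(0), device=z_a.device)  # positives on the diagonal
    return F.cross_entropy(logits, targets)

def feature_matching_loss(encoder: torch.nn.Module,
                          pred_traj: torch.Tensor,
                          true_traj: torch.Tensor) -> torch.Tensor:
    """Stage two: train the emulator so that the frozen contrastive features
    of its rollout match those of the observed trajectory."""
    with torch.no_grad():
        target = encoder(true_traj)
    return F.mse_loss(encoder(pred_traj), target)
```

The appeal of this variant, as the abstract notes, is that no specialized prior knowledge is needed: the encoder learns which statistics distinguish environments directly from the data.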
*The authors gratefully acknowledge the support of DOE grant DE-SC0022232, AFOSR grant FA9550-18-1-0166, and NSF grants DMS-2023109 and DMS-1925101. In addition, Peter Y. Lu gratefully acknowledges the support of the Eric and Wendy Schmidt AI in Science Postdoctoral Fellowship, a Schmidt Futures program.
Publication: R. Jiang, P. Y. Lu, E. Orlova, and R. Willett, Training Neural Operators to Preserve Invariant Measures of Chaotic Attractors. Advances in Neural Information Processing Systems (NeurIPS), 2023. Preprint: arXiv:2306.01187.