ParaMonte: A user-friendly parallel Monte Carlo optimization, sampling, and integration library for scientific inference
POSTER
Abstract
At the foundation of predictive science lies the scientific methodology, which involves multiple steps: collecting observational data, developing testable hypotheses, and making predictions. Once a scientific theory is developed, it can be cast into a mathematical model whose parameters must be constrained by observational data. This leads to the formulation of a mathematical objective function for the problem at hand, which must then be optimized to find the best-fit parameters of the model, sampled to quantify the uncertainties associated with those parameters, or integrated to assess the performance of the model.
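As a minimal sketch of the objective-function formulation described above, the snippet below defines a Gaussian log-likelihood for a straight-line model fit to a handful of observations. The data, parameter names, and noise level are all illustrative assumptions, not part of the library; the same function could then be handed to an optimizer, a sampler, or an integrator.

```python
import math

# Hypothetical observations (x, y) for a straight-line model y = a*x + b.
# These values are illustrative only.
data = [(0.0, 1.1), (1.0, 2.9), (2.0, 5.2), (3.0, 6.8)]
sigma = 0.5  # assumed (known) Gaussian measurement noise

def log_likelihood(params):
    """Gaussian log-likelihood of the data given (a, b): the
    objective function to be optimized, sampled, or integrated."""
    a, b = params
    return sum(
        -0.5 * ((y - (a * x + b)) / sigma) ** 2
        - math.log(sigma * math.sqrt(2.0 * math.pi))
        for x, y in data
    )
```

Maximizing this function yields the best-fit `(a, b)`; sampling it yields their uncertainties; integrating it over the parameter space yields the model evidence.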
Toward this goal, a highly customizable, user-friendly, high-performance parallel Monte Carlo optimizer, sampler, and integrator library is presented here, which can be used on a variety of platforms with single- to many-core processors, with interfaces to popular programming languages including Python, R, MATLAB, Fortran, and C/C++. The algorithms implemented in the library include variants of Markov Chain Monte Carlo that utilize machine-learning techniques to improve sampling performance, as well as Parallel Tempering and Nested Sampling.
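To illustrate the core mechanism behind the Markov Chain Monte Carlo family of algorithms mentioned above, the sketch below implements a plain random-walk Metropolis sampler for a one-dimensional standard normal target. This is a textbook illustration, not the library's API; the function names and step size are assumptions.

```python
import math
import random

def log_target(x):
    """Log-density of a standard normal (up to a constant):
    the distribution we wish to draw samples from."""
    return -0.5 * x * x

def metropolis(n_samples, step=1.0, seed=42):
    """Random-walk Metropolis: propose a Gaussian step, then accept
    it with probability min(1, target(proposal) / target(current))."""
    rng = random.Random(seed)
    x = 0.0
    chain = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal  # accept; otherwise keep the current state
        chain.append(x)
    return chain

chain = metropolis(20000)
mean = sum(chain) / len(chain)  # should be close to 0 for this target
```

Adaptive MCMC variants refine this basic scheme by tuning the proposal distribution on the fly, which is where machine-learning-style adaptation can improve sampling efficiency.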
Presenters
-
Amir Shahmoradi
University of Texas at Arlington
Authors
-
Amir Shahmoradi
University of Texas at Arlington