Minimax entropy principle and dimensionality reduction in neural data
ORAL
Abstract
Recent experimental breakthroughs have paved the way for collecting "big" neural data sets by recording simultaneously from thousands of neurons. Capturing the fundamental principles that govern this collective neural activity is an appealing and timely challenge for theorists. Statistical physics offers robust tools for addressing this challenge through simple models, such as maximum entropy models that match pairwise correlations, which have shown remarkable predictive accuracy for populations of N ~ 100 neurons. These methods, however, are hindered by under-sampling at larger N. To address this issue, we employ the recently rediscovered "minimax entropy" principle [arXiv:2310.10860] to extract a low-dimensional representation of the neural population activity, providing a principled way to reduce the number of model parameters to a range that matches the number of available samples. We then propose two alternative techniques for solving the associated inverse problem: a mean-field analysis and an expansion in the limit of low correlations.
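For a concrete anchor: a pairwise maximum entropy model has the form P(s) ∝ exp(Σ_i h_i s_i + Σ_{i<j} J_ij s_i s_j), and the inverse problem is to infer the fields h_i and couplings J_ij from recorded activity. The sketch below shows the standard naive mean-field inversion (J ≈ −C⁻¹ on the off-diagonal, where C is the connected correlation matrix) as one illustration of the mean-field route to such inverse problems; it is not the authors' specific minimax-entropy algorithm, and the function name and synthetic data are hypothetical.

```python
# Minimal sketch: naive mean-field inversion for a pairwise
# maximum entropy (Ising) model. Illustrative only; not the
# minimax-entropy method described in the abstract.
import numpy as np

def mean_field_inverse_ising(spins):
    """Estimate fields h and couplings J from binary data.

    spins : (T, N) array of +/-1 samples (T samples, N neurons).
    """
    m = spins.mean(axis=0)                # magnetizations <s_i>
    C = np.cov(spins, rowvar=False)       # connected correlations C_ij
    J = -np.linalg.inv(C)                 # naive mean-field: J = -C^{-1}
    np.fill_diagonal(J, 0.0)              # no self-couplings
    # mean-field self-consistency: h_i = atanh(m_i) - sum_j J_ij m_j
    h = np.arctanh(np.clip(m, -0.999, 0.999)) - J @ m
    return h, J

# Hypothetical usage on synthetic data:
rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(5000, 20))
h, J = mean_field_inverse_ising(spins)
```

This inversion is reliable only when the number of samples T well exceeds the population size N, which is exactly the sampling constraint that motivates reducing the parameter count at larger N.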
Presenters
-
Luca Di Carlo
Princeton University
Authors
-
Luca Di Carlo
Princeton University
-
Francesca Mignacco
Princeton University
-
William S. Bialek
Princeton University