Optimizing population coding with unimodal tuning curves and short-term memory in continuous attractor networks

ORAL

Abstract

A widely used tool for quantifying the precision with which a population of sensory neurons encodes the value of an external stimulus is the Fisher Information (FI). Maximizing FI is also a common objective when constructing optimal neural codes. The utility of FI arises primarily from its relation to the mean-squared error of an unbiased stimulus estimator through the Cramér-Rao bound. However, it is well known that when neural firing is sparse, optimizing FI can produce codes that perform very poorly as measured by the resulting mean-squared error, a quantity with direct biological relevance. Here we construct optimal population codes by minimizing the mean-squared error directly and study the scaling properties of the resulting network, focusing on the optimal tuning curve width. We find that the error scales superlinearly with the system size, and this property remains robust in the presence of finite baseline firing. We then extend our results to continuous attractor networks that maintain short-term memory of external stimuli in their dynamics, where we find similar scaling properties in the structure of the recurrent interactions that minimize bump diffusivity.
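For independent Poisson neurons with tuning curves $f_i(s)$, the FI is $J(s) = \sum_i f_i'(s)^2 / f_i(s)$, and the Cramér-Rao bound guarantees $\langle (\hat{s} - s)^2 \rangle \geq 1/J(s)$ for any unbiased estimator $\hat{s}$. The sketch below is not the construction used in the abstract; the neuron count, peak rate, stimulus value, and Gaussian tuning shape are illustrative assumptions. It compares $1/J$ against the empirical mean-squared error of a maximum-likelihood decoder, showing how narrow tuning can maximize FI while the realized error grows because of sparse spiking.

```python
import numpy as np

rng = np.random.default_rng(0)

def tuning(s, centers, width, fmax=10.0):
    """Mean spike counts: unimodal Gaussian tuning curves (illustrative choice)."""
    return fmax * np.exp(-0.5 * ((s - centers) / width) ** 2)

def fisher_info(s, centers, width, fmax=10.0):
    """FI for independent Poisson neurons: J(s) = sum_i f_i'(s)^2 / f_i(s)."""
    f = tuning(s, centers, width, fmax)
    # f'(s) = f(s) * (c - s) / width^2, so f'^2 / f = f * ((c - s) / width^2)^2
    return np.sum(f * ((centers - s) / width**2) ** 2)

def ml_mse(s_true, centers, width, fmax=10.0, trials=2000):
    """Empirical MSE of a grid-search maximum-likelihood decoder."""
    grid = np.linspace(0.0, 1.0, 501)
    F = tuning(grid[:, None], centers[None, :], width, fmax)  # (grid, neuron)
    logF = np.log(F + 1e-12)
    errs = []
    for _ in range(trials):
        counts = rng.poisson(tuning(s_true, centers, width, fmax))
        loglik = logF @ counts - F.sum(axis=1)  # Poisson log-likelihood on grid
        errs.append((grid[np.argmax(loglik)] - s_true) ** 2)
    return float(np.mean(errs))

centers = np.linspace(0.0, 1.0, 50)  # 50 neurons, arbitrary for the sketch
for width in (0.005, 0.02, 0.1):
    J = fisher_info(0.5, centers, width)
    mse = ml_mse(0.5, centers, width)
    print(f"width={width:5.3f}  Cramer-Rao 1/J={1/J:.2e}  ML MSE={mse:.2e}")
```

With very narrow tuning, trials in which few or no neurons spike yield nearly flat likelihoods and catastrophic, non-local estimation errors, a failure mode the Cramér-Rao bound does not capture.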

Presenters

  • Hyun Jin Kim

    Northwestern University

Authors

  • Hyun Jin Kim

    Northwestern University

  • Ila Fiete

    University of Texas at Austin

  • David Schwab

    Institute for Theoretical Science, The Graduate Center, CUNY; Department of Biology, City University of New York