Generation of scale-invariant sequential activity in recurrent neural circuits

ORAL

Abstract

Sequential neural activity has been observed in many parts of the brain. Such sequences can be generated by recurrent neural networks, which have been studied extensively (White and Sompolinsky, 1996; Goldman, 2009; Rajan et al., 2016; Chaudhuri et al., 2016).
Inspired by the Weber-Fechner law in the sensory domain and scalar timing in psychophysics, we study the requirements for generating scale-invariant sequential activity in linear recurrent neural networks. We find that 1) the eigenvalues of the connectivity matrix must be geometrically spaced, and 2) the eigenvectors must have a translation-invariant structure, each composed of the same motif.
Taken together, the relationship between the eigenvectors and eigenvalues can be thought of as a logarithmic mapping from a ratio scale to an interval scale (Luce, 1959). Geometrically spaced network eigenvalues can emerge generically from multiplicative cellular processes (Amir et al., 2012). We are now seeking to extend these results to nonlinear recurrent networks.
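For concreteness, the sketch below illustrates the two requirements in a minimal linear network. It is only an illustration with assumed parameters, not the presented model: the network size, the common ratio of the time constants, and the choice of a centered second-difference motif [-1, 2, -1] are our assumptions. The sketch builds a connectivity matrix whose eigenvalues are geometrically spaced and whose eigenvectors are translated copies of one motif, simulates the linear dynamics, and checks that each unit's peak time is proportional to its time constant, so peak times form a geometric sequence and the activity profiles are rescaled copies of one another.

```python
import numpy as np

N = 15                                        # network size (illustrative)
ratio = 1.3                                   # common ratio of the geometric spacing
tau = ratio ** np.arange(N)                   # time constants 1, 1.3, 1.3^2, ...
eigvals = -1.0 / tau                          # requirement 1: geometrically spaced eigenvalues

# Requirement 2: every eigenvector is the same local motif, translated across units.
# A centered second difference [-1, 2, -1] is one simple choice of motif (an assumption;
# the motif in the presented work may differ).
V = 2.0 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)

M = V @ np.diag(eigvals) @ np.linalg.inv(V)   # recurrent connectivity matrix

# Excite every eigenmode equally at t = 0, then let the network evolve freely.
x = V @ np.ones(N)
dt, T = 0.01, 60.0
steps = int(T / dt)
activity = np.empty((steps, N))
for t in range(steps):
    x = x + dt * (M @ x)                      # forward-Euler integration of dx/dt = M x
    activity[t] = x

# Each interior unit's activity is the same function of t / tau_i, so peaks
# occur at times proportional to tau_i, i.e. the sequence is scale-invariant.
peak_times = (np.argmax(activity, axis=0) + 1) * dt
interior = slice(1, N - 1)                    # edge units are missing one motif neighbor
print(np.round(peak_times[interior] / tau[interior], 2))                  # ~constant across units
print(np.round(peak_times[interior][1:] / peak_times[interior][:-1], 2))  # ~ratio of time constants
```

In this sketch the two printed diagnostics summarize scale invariance: the ratio of peak time to time constant is the same for every interior unit, and consecutive peak times are spaced by the same common ratio as the eigenvalue spectrum.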

Presenters

  • Yue Liu

    Department of Physics and Center for Systems Neuroscience, Boston University

Authors

  • Yue Liu

    Department of Physics and Center for Systems Neuroscience, Boston University

  • Marc W Howard

    Department of Physics and Center for Systems Neuroscience, Boston University