Learning in a noisy environment: a Lyapunov equation approach

ORAL

Abstract

Consider a behavioral task described as a finite-time trajectory through a $d$-dimensional space, segmented into $K$ time steps, and thus fully specified by a vector $X$ in the $n=dK$ dimensional state space of possible trajectories. Consider the dynamics of learning a desired target trajectory $X^{*}$. In the vicinity of $X^{*}$, the learning dynamics at the $t$-th discrete learning time step can be linearized to $Y_{t+1}=M Y_{t}+\xi_{t}$, where $Y_{t}=X_{t}-X^{*}$ and $\xi$ is independent Gaussian noise of zero mean and covariance $\Delta$. The balance between contracting dynamics and noise leads to an asymptotic covariance $Q$ that obeys the Lyapunov equation $Q=M Q M^T+\Delta$. Given $Q$, how can the unknown deterministic component $M$ be estimated in the presence of noise? We propose the use of systematic target perturbations $X^{*} \to X^{*}+\epsilon V_j$, with unit vectors $V_j$, $1 \le j \le n$, that span the state space. We argue, convincingly if not rigorously, that the linear response to these perturbations fully characterizes the asymptotic dynamics of the learning process. We illustrate the method by analyzing networks of neurons with either intrinsic or extrinsic noise, at time resolutions that span from spike timing to spiking rates.
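The fixed-point relation between contraction and noise can be sketched numerically. The following is a minimal illustration, not the authors' method: it assumes NumPy, a randomly generated contracting $M$ (spectral radius scaled below 1), and an arbitrary positive-definite $\Delta$; it builds $Q$ from the convergent series $Q=\sum_k M^k \Delta (M^T)^k$ and checks it both against the Lyapunov equation and against a direct simulation of $Y_{t+1}=M Y_t+\xi_t$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4  # illustrative state-space dimension

# Random contracting dynamics matrix M (spectral radius rescaled to 0.5).
A = rng.standard_normal((n, n))
M = 0.5 * A / np.max(np.abs(np.linalg.eigvals(A)))

# Noise covariance Delta (symmetric positive definite).
B = rng.standard_normal((n, n))
Delta = B @ B.T / n

# Solve the Lyapunov equation Q = M Q M^T + Delta via its convergent series
# Q = sum_k M^k Delta (M^T)^k, valid because the spectral radius of M is < 1.
Q = Delta.copy()
term = Delta.copy()
for _ in range(200):
    term = M @ term @ M.T
    Q += term

# Q is a fixed point of the covariance update of the noisy linear dynamics.
residual = np.linalg.norm(Q - (M @ Q @ M.T + Delta))
print(residual < 1e-10)  # True

# Empirical check: simulate Y_{t+1} = M Y_t + xi_t, xi_t ~ N(0, Delta),
# and compare the long-run sample covariance with Q.
L = np.linalg.cholesky(Delta)
Y = np.zeros(n)
T, burn = 100_000, 1_000
acc = np.zeros((n, n))
for t in range(T):
    Y = M @ Y + L @ rng.standard_normal(n)
    if t >= burn:  # discard the initial transient
        acc += np.outer(Y, Y)
Q_emp = acc / (T - burn)
rel_err = np.linalg.norm(Q_emp - Q) / np.linalg.norm(Q)
print(rel_err)  # small relative error
```

The series solution converges geometrically at rate given by the squared spectral radius of $M$; for larger systems a dedicated discrete Lyapunov solver (e.g. `scipy.linalg.solve_discrete_lyapunov`) would be the standard choice.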

Authors

  • Sara A Solla

    Northwestern University

  • Yarden Cohen

    Weizmann Institute of Science

  • Predrag Cvitanovic

    Georgia Institute of Technology