Neural processing and computation in the visual system
Invited
Abstract
Visual systems detect many features of natural scenes, but one of the best studied is how they detect motion, a computation widespread across animals. To estimate visual motion speed and direction, the visual system must integrate information nonlinearly over space and time. Models of visual motion estimation have two essential features: a delay step that delays certain signals with respect to others, and a nonlinear step that integrates signals over space and time to create the motion estimate. The computation itself can be framed as an inference problem, in which spatiotemporal light intensity measurements are combined to estimate a latent variable of image velocity. I will present recent work on understanding how motion is computed in the small brain of the fruit fly Drosophila, where genetic tools allow us to dissect the roles of individual neurons in circuit computations. I will focus on the mathematical operations that describe the transformation from light intensity to motion signals, and how that algorithm performs with different visual inputs. I will also discuss processing steps that appear similar across visual systems. These parallels suggest that there may be a narrow range of motion estimation algorithms that perform well given the constraints of biological systems and the regularities of natural scenes.
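The two essential features named above, a delay step and a nonlinear integration step, can be sketched as a minimal correlator-style model. This is an illustrative toy, not the specific model presented in the talk; the signal names and delay parameter are assumptions for the example:

```python
import numpy as np

def motion_estimate(left, right, delay=1):
    """Correlator-style motion estimate from two neighboring
    photoreceptor signals sampled over time.

    Delay step: each signal is shifted in time by `delay` samples.
    Nonlinear step: the delayed signal from one point is multiplied
    by the undelayed signal from its neighbor; subtracting the two
    mirror-symmetric products yields a direction-selective output.
    """
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    # Delay step: shift each signal by `delay` samples (zero-padded).
    d_left = np.concatenate([np.zeros(delay), left[:-delay]])
    d_right = np.concatenate([np.zeros(delay), right[:-delay]])
    # Nonlinear step: multiply delayed and undelayed neighbors,
    # then subtract the opposite pairing for direction selectivity.
    return d_left * right - left * d_right

# A bright edge moving left-to-right: the right point sees it one
# time step after the left point, so the net output is positive.
t = np.arange(10)
stim_left = (t >= 3).astype(float)
stim_right = (t >= 4).astype(float)
out = motion_estimate(stim_left, stim_right)
print(out.sum() > 0)  # prints True for rightward motion
```

Swapping the two stimuli (motion in the opposite direction) flips the sign of the summed output, which is the hallmark of a direction-selective estimator.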
Presenters
Damon Clark
Molecular, Cellular, and Developmental Biology, Yale University
Authors
Damon Clark
Molecular, Cellular, and Developmental Biology, Yale University