Information Processing in the Somatosensory System
Invited
Abstract
Touch perception arises from a closed-loop process in which animals move their sensors in order to actively seek out informative mechanical input, which is then processed to direct further movements. My laboratory uses the mouse whisker system to explore the neural basis of active touch perception, at essentially all levels of the nervous system. Information processing during active touch is constrained by the coding properties of neurons in the skin that transduce mechanical stimuli into action potentials. I will discuss recent work from my laboratory that addresses how specific types of mechanosensory neurons encode (a) mechanical features of the environment during active touch, and (b) self-motion kinematics. These two streams of information allow tactile input to be interpreted with respect to sensor position. After mechanical features of the environment are encoded into action potentials and sent to the central nervous system, multiple factors determine how these neural signals are routed within circuits of the brain to impact perception. I will discuss a second line of work in my lab that addresses how identically encoded sensory inputs can produce quite different perceptual outcomes. By monitoring and manipulating neural activity at multiple levels of the nervous system during behavior, we have gained insight into the mapping between mechanosensation and perception.
Presenters
Daniel O'Connor
Department of Neuroscience, Johns Hopkins University School of Medicine