Understanding the dynamical control of animal movement
Coffee Klatch · Invited
Abstract
Over the last 50 years, neurophysiologists have described many neural circuits that transform sensory input into motor commands, while biomechanicians and behavioral biologists have described many patterns of animal movement that occur in response to sensory input. Attempts to link the two have been frustrated by our technical inability to record from the necessary neurons in a freely behaving animal. As a result, we do not know how these neural circuits function in the closed-loop context of free behavior, where the sensory and motor context changes on a millisecond timescale. To address this problem, we have developed a software package, AnimatLab (www.AnimatLab.com), that enables users to reconstruct an animal's body and its relevant neural circuits, to link them at the sensory and motor ends, and, through simulation, to test their ability to reproduce appropriate patterns of the animal's movements in a simulated Newtonian world. A Windows-based program, AnimatLab consists of a neural editor, a body editor, a world editor, stimulus and recording facilities, neural and physics engines, and an interactive 3-D graphical display. We have used AnimatLab to study three patterns of behavior: the grasshopper jump, crayfish escape, and the crayfish leg movements used in postural control, walking, reaching, and grasping. In each instance, the simulation helped identify constraints on both nervous system function and biomechanical performance that have provided the basis for new experiments. Colleagues elsewhere have begun to use AnimatLab to study the control of paw movements in cats and postural control in humans. We have also used AnimatLab simulations to guide the development of an autonomous hexapod robot whose neural control circuitry is downloaded to it from the test computer.
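The closed-loop coupling described above, in which a neural model drives a body model whose mechanical state is fed back as sensory input on a millisecond step, can be illustrated with a minimal sketch. The Python below is not AnimatLab code and does not use its API; the leaky-integrator "motor neuron", the single-joint pendulum limb, and every gain and limb parameter are hypothetical choices made only for illustration.

```python
import math

# Hypothetical minimal closed-loop neuromechanical simulation (not AnimatLab).
# A leaky-integrator "motor neuron" produces torque on a single-joint pendulum
# limb; the joint angle is fed back as a stretch-like sensory signal, closing
# the sensorimotor loop at every millisecond time step.

DT = 0.001                # integration step, s (millisecond timescale)
TAU = 0.020               # neuron "membrane" time constant, s (assumed)
GAIN_SENS = 2.0           # sensory feedback gain (assumed)
GAIN_MOTOR = 0.4          # torque per unit firing rate, N*m (assumed)
MASS, LENGTH, G = 0.2, 0.1, 9.81   # limb segment parameters (assumed)
DAMPING = 0.06            # joint damping, N*m*s/rad (assumed)

def simulate(t_end=1.0, target_angle=0.5):
    """Run the closed loop and return (time, joint angle) samples."""
    v = 0.0                  # neuron activation state
    theta, omega = 0.0, 0.0  # joint angle (rad) and angular velocity (rad/s)
    inertia = MASS * LENGTH ** 2
    trace = []
    for i in range(int(t_end / DT)):
        # Sensory side: error between the target posture and the joint angle.
        sensory_input = GAIN_SENS * (target_angle - theta)
        # Neural side: leaky integration of the sensory drive.
        v += DT * (-v + sensory_input) / TAU
        rate = max(v, 0.0)   # rectified "firing rate"
        # Motor side: muscle-like torque, gravity, and damping act on the limb.
        torque = GAIN_MOTOR * rate
        gravity = -MASS * G * LENGTH * math.sin(theta)
        alpha = (torque + gravity - DAMPING * omega) / inertia
        omega += DT * alpha
        theta += DT * omega
        trace.append((i * DT, theta))
    return trace

if __name__ == "__main__":
    for t, theta in simulate()[::100]:   # report every 0.1 s
        print(f"t={t:.2f} s  joint angle={theta:.3f} rad")
```

With these assumed parameters the limb settles slightly short of the target posture, since proportional feedback alone does not cancel the gravitational torque; the point of the sketch is only the structure of the loop, in which neural and mechanical states advance together on the same millisecond step.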
Authors
Donald Edwards
Georgia State University