Published December 1998
Journal Article

Visual self-motion perception during head turns

Abstract

Extra-retinal information is critical in the interpretation of visual input during self-motion. Turning our eyes and head to track objects displaces the retinal image but does not affect our ability to navigate, because we use extra-retinal information to compensate for these displacements. We showed observers animated displays depicting their forward motion through a scene. They perceived the simulated self-motion accurately while smoothly shifting gaze by turning the head, but not when the same gaze shift was simulated in the display, indicating that the visual system also uses extra-retinal information during head turns. Additional experiments compared self-motion judgments during active head turns, passive head turns, passive rotations of the body, and rotations of the body with the head fixed in space. We found that accurate perception during active head turns is mediated by contributions from three extra-retinal cues: vestibular canal stimulation, neck proprioception, and an efference copy of the motor command to turn the head.

Additional Information

© 1998 Nature America Inc. Received 14 September; Accepted 21 October 1998. Some of the software used to run these experiments was written by Payam Saisan, Karsten Weber and Kirk Swenson at U.C. Berkeley. Portions of the apparatus were designed and built by Ric Paniagua and John Klemic at Caltech and Dave Rehder, Payam Saisan and Larry Gibson at U.C. Berkeley. Administrative assistance was provided by Sylvie Gertmenian and Cierina Reyes at Caltech and May Wong at U.C. Berkeley. This research was supported by the Human Frontiers Program, NIH-NEI, ONR and the James G. Boswell Neuroscience Professorship.
