Multisensory guidance of orienting behavior.

Journal Article (Review)

We use both vision and audition when localizing objects and events in our environment. However, these sensory systems receive spatial information in different coordinate systems: sounds are localized using inter-aural and spectral cues, yielding a head-centered representation of space, whereas the visual system uses an eye-centered representation of space, based on the site of activation on the retina. In addition, the visual system employs a place-coded, retinotopic map of space, whereas the auditory system's representational format is characterized by broad spatial tuning and a lack of topographical organization. A common view is that the brain needs to reconcile these differences in order to control behavior, such as orienting gaze to the location of a sound source. To accomplish this, it seems that either auditory spatial information must be transformed from a head-centered rate code to an eye-centered map to match the frame of reference used by the visual system, or vice versa. Here, we review a number of studies that have focused on the neural basis underlying such transformations in the primate auditory system. Although these studies have found some evidence for such transformations, many differences in the way the auditory and visual systems encode space persist throughout the auditory pathway. We review these differences at the neural level and discuss them in relation to differences in the way auditory and visual information is used to guide orienting movements.
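The reference-frame transformation described above can be illustrated with a minimal sketch (not from the paper itself; the function name and degree-based convention are assumptions for illustration). For horizontal azimuth, a target's eye-centered direction is its head-centered direction minus the current horizontal eye-in-head position:

```python
# Hypothetical sketch of the head-centered -> eye-centered shift discussed
# in the abstract. Positive values denote rightward directions in degrees.

def head_to_eye_centered(target_azimuth_head_deg: float,
                         eye_position_deg: float) -> float:
    """Eye-centered azimuth = head-centered azimuth - eye-in-head position."""
    return target_azimuth_head_deg - eye_position_deg

# A sound 20 deg right of the head midline, with gaze directed 15 deg right,
# lies only 5 deg right of the current direction of gaze:
print(head_to_eye_centered(20.0, 15.0))  # 5.0
```

The same subtraction run in reverse (adding eye position) would convert an eye-centered visual location into head-centered coordinates, which is the "vice versa" direction the abstract mentions.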

Cited Authors

  • Maier, JX; Groh, JM

Published Date

  • December 2009

Published In

  • Hearing Research

Volume / Issue

  • 258 / 1-2

Start / End Page

  • 106 - 112

PubMed ID

  • 19520151

Pubmed Central ID

  • PMC2790026

Electronic International Standard Serial Number (EISSN)

  • 1878-5891

International Standard Serial Number (ISSN)

  • 0378-5955

Digital Object Identifier (DOI)

  • 10.1016/j.heares.2009.05.008

Language

  • eng