Journal Article · Trends in cognitive sciences · July 2024
Our ability to perceive multiple objects is mysterious. Sensory neurons are broadly tuned, producing potential overlap in the populations of neurons activated by each object in a scene. This overlap raises questions about how distinct information is retain ...
Journal Article · eLife · March 2024
How neural representations preserve information about multiple stimuli is mysterious. Because tuning of individual neurons is coarse (e.g., visual receptive field diameters can exceed perceptual resolution), the populations of neurons potentially responsiv ...
Journal Article · Hear Res · December 2023
We recently discovered a unique type of otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscilla ...
Journal Article · Proceedings of the National Academy of Sciences of the United States of America · November 2023
Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect neural pathways from the ear through auditory cortex and beyond, but how these signals contribute to computing the locations of sou ...
Journal Article · Philosophical transactions of the Royal Society of London. Series B, Biological sciences · September 2023
Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory sp ...
Journal Article · bioRxiv · August 6, 2023
We recently discovered a unique type of low-frequency otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related e ...
Journal Article · bioRxiv · May 22, 2023
Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory sp ...
Journal Article · eLife · November 2022
Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present at a given moment? We recently showed that when more than o ...
Journal Article · The European journal of neuroscience · January 2022
How we distinguish multiple simultaneous stimuli is uncertain, particularly given that such stimuli sometimes recruit largely overlapping populations of neurons. One commonly proposed hypothesis is that the sharpness of tuning curves might change to limit ...
Journal Article · Annual review of vision science · September 2021
Coordination between different sensory systems is a necessary element of sensory processing. Where and how signals from different sense organs converge onto common neural circuitry have become topics of increasing interest in recent years. In this article, ...
Journal Article · Journal of neurophysiology · July 2021
Stimulus locations are detected differently by different sensory systems, but ultimately they yield similar percepts and behavioral responses. How the brain transcends initial differences to compute similar codes is unclear. We quantitatively compared the ...
Journal Article · The annals of applied statistics · March 2021
Conventional analysis of neuroscience data involves computing average neural activity over a group of trials and/or a period of time. This approach may be particularly problematic when assessing the response patterns of neurons to more than one simultaneou ...
Journal Article · J Neurophysiol · September 1, 2020
The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, optimally unifying such signals requires assigning particular signals to the same or different underlying objects or events. Many prior ...
Journal Article · Journal of eye movement research · November 2019
Keynote by Jenny Groh (Duke University) at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019. Information about eye movements with respect to the head is required for reconciling visual and auditory space. This keynote pres ...
Journal Article · September 23, 2019
Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present ...
Journal Article · The Journal of the Acoustical Society of America · August 2019
Visual calibration of auditory space requires re-alignment of representations differing in (1) format (auditory hemispheric channels vs visual maps) and (2) reference frames (head-centered vs eye-centered). Here, a ventriloquism paradigm from Kopčo, Lin, S ...
Journal Article · Nature communications · July 2018
Featured Publication
How the brain preserves information about multiple simultaneous items is poorly understood. We report that single neurons can represent multiple stimuli by interleaving signals across time. We record single units in an auditory region, the inferior collicu ...
Journal Article · Proceedings of the National Academy of Sciences of the United States of America · February 2018
Featured Publication
Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in huma ...
Journal Article · Journal of Neurophysiology · December 20, 2017
Featured Publication
We accurately perceive the visual scene despite moving our eyes ~3 times per second, an ability that requires incorporation of eye position and retinal information. In this study, we assessed how this neural computation unfolds across three interconnected ...
Journal Article · J Neurosci · May 4, 2016
Understanding the relationship between the auditory selectivity of neurons and their contribution to perception is critical to the design of effective auditory brain prosthetics. These prosthetics seek to mimic natural activity patterns to achi ...
Journal Article · Journal of Neurophysiology · March 2, 2016
Saccadic eye movements can be elicited by more than one type of sensory stimulus. This implies substantial transformations of signals originating in different sense organs as they reach a common motor output pathway. In this study, we compared the preval ...
Journal Article · PLoS One · January 15, 2014
Featured Publication
Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive ne ...
Journal Article · Behav Res Ther · June 2013
The nature of disturbance in body experience in anorexia nervosa (AN) remains poorly operationalized despite its prognostic significance. We examined the relationship of subjective reports of sensitivity to and behavioral avoidance of sensory experience (e ...
Journal Article · PLoS One · 2013
Featured Publication
A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visua ...
Journal Article · Frontiers in Neural Circuits · November 15, 2012
The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory s ...
Journal Article · Journal of neurophysiology · July 2012
Visual and auditory spatial signals initially arise in different reference frames. It has been postulated that auditory signals are translated from a head-centered to an eye-centered frame of reference compatible with the visual spatial maps, but, to date, ...
Journal Article · Journal of neurophysiology · February 2012
The inferior colliculus (IC) is thought to have two main subdivisions, a central region that forms an important stop on the ascending auditory pathway and a surrounding shell region that may play a more modulatory role. In this study, we investigated wheth ...
Journal Article · Frontiers in neural circuits · January 2012
The inferior colliculus (IC) is an essential stop early in the ascending auditory pathway. Though normally thought of as a predominantly auditory structure, recent work has uncovered a variety of non-auditory influences on firing rate in the IC. Here, we m ...
Journal Article · Journal of neurophysiology · April 2011
We investigated the functional architecture of the inferior colliculus (IC) in rhesus monkeys. We systematically mapped multiunit responses to tonal stimuli and noise in the IC and surrounding tissue of six rhesus macaques, collecting data at evenly placed ...
Journal Article · Frontiers in integrative neuroscience · January 2011
The motor layers of the superior colliculus (SC) are thought to specify saccade amplitude and direction, independent of initial eye position. However, recent evidence suggests that eye position can modulate the level of activity of SC motor neurons. In thi ...
Chapter · February 1, 2010
When you hear a salient sound, it is natural to look at it to find out what is happening. Orienting the eyes to look at sounds is essential to our ability to identify and understand the events occurring in our environment. This behavior involves both senso ...
Journal Article · Frontiers in integrative neuroscience · January 2010
We evaluated to what extent the influence of eye position in the auditory pathway of primates can be described as a gain field. We compared single unit activity in the inferior colliculus (IC), core auditory cortex (A1) and the caudomedial belt (CM) region ...
Journal Article · Hearing research · December 2009
We use both vision and audition when localizing objects and events in our environment. However, these sensory systems receive spatial information in different coordinate systems: sounds are localized using inter-aural and spectral cues, yielding a head-cen ...
Journal Article · The Journal of neuroscience: the official journal of the Society for Neuroscience · November 2009
Seeing the image of a newscaster on a television set causes us to think that the sound coming from the loudspeaker is actually coming from the screen. How images capture sounds is mysterious because the brain uses different methods for determining the loca ...
Journal Article · Cereb Cortex · August 2009
Featured Publication
The reference frame used by intraparietal cortex neurons to encode locations is controversial. Many previous studies have suggested eye-centered coding, whereas we have reported that visual and auditory signals employ a hybrid reference frame (i.e., a comb ...
Journal Article · J Neurosci · April 2, 2008
Featured Publication
Is sound location represented in the auditory cortex of humans and monkeys? Human neuroimaging experiments have had only mixed success at demonstrating sound location sensitivity in primary auditory cortex. This is in apparent conflict with studies in monk ...
Journal Article · Proc Natl Acad Sci U S A · November 6, 2007
Featured Publication
The inferior colliculus (IC) is normally thought of as a predominantly auditory structure because of its early position in the ascending auditory pathway just before the auditory thalamus. Here, we show that a majority of IC neurons (64% of 180 neurons) in ...
Journal Article · Current opinion in neurobiology · August 2006
Objects and events can often be detected by more than one sensory system. Interactions between sensory systems can offer numerous benefits for the accuracy and completeness of the perception. Recent studies involving visual-auditory interactions have highl ...
Journal Article · The Journal of neuroscience: the official journal of the Society for Neuroscience · July 2006
Neural activity in the inferior colliculus (IC) likely plays an integral role in the processing of various auditory parameters, such as sound location and frequency. However, little is known about the extent to which IC neural activity may be influenced by ...
Journal Article · Journal of neurophysiology · March 2006
We studied the representation of eye-position information in the primate inferior colliculus (IC). Monkeys fixated visual stimuli at one of eight or nine locations along the horizontal meridian between -24 and 24 degrees while sounds were presented from lo ...
Journal Article · Experimental brain research · January 2006
How the brain responds to sequences of sounds is a question of great relevance to a variety of auditory perceptual phenomena. We investigated how long the responses of neurons in the primary auditory cortex of awake monkeys are influenced by the previous s ...
Journal Article · Progress in brain research · January 2006
Multisensory integration of spatial signals requires not only that stimulus locations be encoded in the same spatial reference frame, but also that stimulus locations be encoded in the same representational format. Previous studies have addressed the issue ...
Journal Article · Journal of neurophysiology · October 2005
The integration of visual and auditory events is thought to require a joint representation of visual and auditory space in a common reference frame. We investigated the coding of visual and auditory space in the lateral and medial intraparietal areas (LIP, ...
Journal Article · Journal of neurophysiology · October 2004
Auditory spatial information arises in a head-centered coordinate frame, whereas the saccade command signals generated by the superior colliculus (SC) are thought to specify target locations in an eye-centered frame. However, auditory activity in the SC ap ...
Journal Article · Journal of cognitive neuroscience · November 2003
We investigated the format of the code for sound location in the inferior colliculi of three awake monkeys (Macaca mulatta). We found that roughly half of our sample of 99 neurons was sensitive to the free-field locations of broadband noise presented in th ...
Journal Article · Current biology: CB · April 2003
Background: Neurons in primary auditory cortex are known to be sensitive to the locations of sounds in space, but the reference frame for this spatial sensitivity has not been investigated. Conventional wisdom holds that the auditory and visual path ...
Chapter · January 1, 2002
Knowing where things are in space is essential to our existence. The visual, auditory, and cutaneous senses all contribute to the perception of stimulus location, but they acquire spatial information in radically different ways. Positional information is l ...
Journal Article · Biological cybernetics · September 2001
The nervous system uses two basic types of formats for encoding information. The parameters of many sensory (and some premotor) signals are represented by the pattern of activity among an array of neurons each of which is optimally responsive to a differen ...
Journal Article · Vision research · September 2001
Determining the precise moment a visual stimulus appears is difficult because visual response latencies vary. This temporal uncertainty could cause localization errors to brief visual targets presented before and during eye movements if the oculomotor syst ...
Journal Article · Neuron · February 2001
We examined the frame of reference of auditory responses in the inferior colliculus in monkeys fixating visual stimuli at different locations. Eye position modulated the level of auditory responses in 33% of the neurons we encountered, but it did not appea ...
Journal Article · Neuron · June 2000
To track a moving object, its motion must first be distinguished from that of the background. The center-surround properties of neurons in the middle temporal visual area (MT) may be important for signaling the relative motion between object and background ...
Journal Article · Current biology: CB · June 1998
Monkeys trained to distinguish touch stimuli that 'flutter' with different frequencies can similarly distinguish electrical stimulation of the somatosensory cortex according to its frequency; the implication is that the electrically-evoked patterns of cort ...
Journal Article · The Journal of neuroscience: the official journal of the Society for Neuroscience · June 1997
To generate behavioral responses based on sensory input, motor areas of the brain must interpret, or "read out," signals from sensory maps. Our experiments tested several algorithms for how the motor systems for smooth pursuit and saccadic eye movements mi ...
Journal Article · Current biology: CB · November 1996
Pronounced effects of attention have been demonstrated in a region of visual cortex previously thought to be devoid of such influences; identifying the features critical for eliciting these effects should teach us a great deal about the neural underpinning ...
Journal Article · Investigative Ophthalmology and Visual Science · February 15, 1996
Purpose. Tracking a moving target that appears in the periphery involves two kinds of eye movements, saccades and smooth pursuit. Both kinds of movements require a signal of target velocity. The saccadic system must use a target velocity signal to compensa ...
Journal Article · Journal of neurophysiology · January 1996
1. We compared the properties of saccades to somatosensory and visual targets. This comparison provides insight into the translation of sensory signals coding target location in different sensory coordinate frameworks into motor commands of a common format ...
Journal Article · Journal of neurophysiology · January 1996
1. We examined cells with saccade-related activity in the superior colliculus (SC) of monkeys performing saccades to both somatosensory and visual targets. Our goals were 1) to determine whether signals from these separate sensory systems have converged on ...
Journal Article · Journal of neurophysiology · January 1996
1. We recorded from cells with sensory responses to somatosensory stimuli in the superior colliculus (SC) of awake monkeys. Our goal was to determine the frame of reference of collicular somatosensory signals by seeing whether the positions of the eyes inf ...
Journal Article · Biological cybernetics · January 1992
Two models for transforming auditory signals from head-centered to eye-centered coordinates are presented. The vector subtraction model subtracts a rate-coded eye position signal from a topographically weighted auditory target position signal to produce a ...
Full textCite