Jennifer M. Groh

Professor of Psychology and Neuroscience
Psychology & Neuroscience
Duke Box 90999, Durham, NC 27708-0999
LSRC B252, Durham, NC 27708

Selected Publications


Signal switching may enhance processing power of the brain.

Journal Article Trends in cognitive sciences · July 2024 Our ability to perceive multiple objects is mysterious. Sensory neurons are broadly tuned, producing potential overlap in the populations of neurons activated by each object in a scene. This overlap raises questions about how distinct information is retain ... Full text Cite

Multiple objects evoke fluctuating responses in several regions of the visual pathway.

Journal Article eLife · March 2024 How neural representations preserve information about multiple stimuli is mysterious. Because tuning of individual neurons is coarse (e.g., visual receptive field diameters can exceed perceptual resolution), the populations of neurons potentially responsiv ... Full text Cite

Individual similarities and differences in eye-movement-related eardrum oscillations (EMREOs).

Journal Article Hear Res · December 2023 We recently discovered a unique type of otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related eardrum oscilla ... Full text Link to item Cite

Parametric information about eye movements is sent to the ears.

Journal Article Proceedings of the National Academy of Sciences of the United States of America · November 2023 Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect neural pathways from the ear through auditory cortex and beyond, but how these signals contribute to computing the locations of sou ... Full text Cite

Conserved features of eye movement related eardrum oscillations (EMREOs) across humans and monkeys.

Journal Article Philosophical transactions of the Royal Society of London. Series B, Biological sciences · September 2023 Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory sp ... Full text Cite

Individual similarities and differences in eye-movement-related eardrum oscillations (EMREOs).

Journal Article bioRxiv · August 6, 2023 We recently discovered a unique type of low-frequency otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements and occurring in the absence of external sound (Gruters et al., 2018). How and why these eye-movement-related e ... Full text Link to item Cite

Conserved features of eye movement related eardrum oscillations (EMREOs) across humans and monkeys.

Journal Article bioRxiv · May 22, 2023 Auditory and visual information involve different coordinate systems, with auditory spatial cues anchored to the head and visual spatial cues anchored to the eyes. Information about eye movements is therefore critical for reconciling visual and auditory sp ... Full text Link to item Cite

Coordinated multiplexing of information about separate objects in visual cortex.

Journal Article eLife · November 2022 Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present at a given moment? We recently showed that when more than o ... Full text Open Access Cite

Multiple sounds degrade the frequency representation in monkey inferior colliculus.

Journal Article The European journal of neuroscience · January 2022 How we distinguish multiple simultaneous stimuli is uncertain, particularly given that such stimuli sometimes recruit largely overlapping populations of neurons. One commonly proposed hypothesis is that the sharpness of tuning curves might change to limit ... Full text Cite

Visual Signals in the Mammalian Auditory System.

Journal Article Annual review of vision science · September 2021 Coordination between different sensory systems is a necessary element of sensory processing. Where and how signals from different sense organs converge onto common neural circuitry have become topics of increasing interest in recent years. In this article, ... Full text Cite

Compensating for a shifting world: evolving reference frames of visual and auditory signals across three multimodal brain areas.

Journal Article Journal of neurophysiology · July 2021 Stimulus locations are detected differently by different sensory systems, but ultimately they yield similar percepts and behavioral responses. How the brain transcends initial differences to compute similar codes is unclear. We quantitatively compared the ... Full text Cite

Analyzing second order stochasticity of neural spiking under stimuli-bundle exposure.

Journal Article The annals of applied statistics · March 2021 Conventional analysis of neuroscience data involves computing average neural activity over a group of trials and/or a period of time. This approach may be particularly problematic when assessing the response patterns of neurons to more than one simultaneou ... Full text Cite

Monkeys and humans implement causal inference to simultaneously localize auditory and visual stimuli.

Journal Article J Neurophysiol · September 1, 2020 The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, optimally unifying such signals requires assigning particular signals to the same or different underlying objects or events. Many prior ... Full text Link to item Cite

Hearing in a world of light: why, where, and how visual and auditory information are connected by the brain.

Journal Article Journal of eye movement research · November 2019 Keynote by Jenny Groh (Duke University) at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019 Information about eye movements with respect to the head is required for reconciling visual and auditory space. This keynote pres ... Full text Cite

Coordinated multiplexing of information about separate objects in visual cortex

Journal Article · September 23, 2019 Abstract: Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present ... Full text Cite

Hemisphere-specific properties of the ventriloquism aftereffect.

Journal Article The Journal of the Acoustical Society of America · August 2019 Visual calibration of auditory space requires re-alignment of representations differing in (1) format (auditory hemispheric channels vs visual maps) and (2) reference frames (head-centered vs eye-centered). Here, a ventriloquism paradigm from Kopčo, Lin, S ... Full text Cite

Single neurons may encode simultaneous stimuli by switching between activity patterns.

Journal Article Nature communications · July 2018 Featured Publication How the brain preserves information about multiple simultaneous items is poorly understood. We report that single neurons can represent multiple stimuli by interleaving signals across time. We record single units in an auditory region, the inferior collicu ... Full text Open Access Cite

The eardrums move when the eyes move: A multisensory effect on the mechanics of hearing.

Journal Article Proceedings of the National Academy of Sciences of the United States of America · February 2018 Featured Publication Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in huma ... Full text Open Access Cite

Beyond the labeled line: variation in visual reference frames from intraparietal cortex to frontal eye fields and the superior colliculus

Journal Article Journal of Neurophysiology · December 20, 2017 Featured Publication We accurately perceive the visual scene despite moving our eyes ~3 times per second, an ability that requires incorporation of eye position and retinal information. In this study, we assessed how this neural computation unfolds across three interconnected ... Full text Open Access Cite

Effects of Electrical Stimulation in the Inferior Colliculus on Frequency Discrimination by Rhesus Monkeys and Implications for the Auditory Midbrain Implant.

Journal Article J Neurosci · May 4, 2016 UNLABELLED: Understanding the relationship between the auditory selectivity of neurons and their contribution to perception is critical to the design of effective auditory brain prosthetics. These prosthetics seek to mimic natural activity patterns to achi ... Full text Open Access Link to item Cite

Similar prevalence and magnitude of auditory-evoked and visually-evoked activity in the frontal eye fields: Implications for multisensory motor control

Journal Article Journal of Neurophysiology · March 2, 2016 Saccadic eye movements can be elicited by more than one type of sensory stimulus. This implies substantial transformations of signals originating in different sense organs as they reach a common motor output pathway. In this study, we compared the preval ... Full text Open Access Cite

Making Space: How the Brain Knows Where Things Are

Book · November 5, 2014 Featured Publication Link to item Cite

Different Stimuli, Different Spatial Codes: A Visual Map and an Auditory Rate Code for Oculomotor Space in the Primate Superior Colliculus

Journal Article PLoS One · January 15, 2014 Featured Publication Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive ne ... Full text Open Access Cite

Subjective experience of sensation in anorexia nervosa.

Journal Article Behav Res Ther · June 2013 The nature of disturbance in body experience in anorexia nervosa (AN) remains poorly operationalized despite its prognostic significance. We examined the relationship of subjective reports of sensitivity to and behavioral avoidance of sensory experience (e ... Full text Open Access Link to item Cite

Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization.

Journal Article PLoS One · 2013 Featured Publication A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visua ... Full text Open Access Link to item Cite

Sounds and beyond: Multisensory and other non-auditory signals in the inferior colliculus

Journal Article Frontiers in Neural Circuits · November 15, 2012 The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory s ... Full text Open Access Cite

Auditory signals evolve from hybrid- to eye-centered coordinates in the primate superior colliculus.

Journal Article Journal of neurophysiology · July 2012 Visual and auditory spatial signals initially arise in different reference frames. It has been postulated that auditory signals are translated from a head-centered to an eye-centered frame of reference compatible with the visual spatial maps, but, to date, ... Full text Open Access Cite

Distribution of eye position information in the monkey inferior colliculus.

Journal Article Journal of neurophysiology · February 2012 The inferior colliculus (IC) is thought to have two main subdivisions, a central region that forms an important stop on the ascending auditory pathway and a surrounding shell region that may play a more modulatory role. In this study, we investigated wheth ... Full text Open Access Cite

Distribution of visual and saccade related information in the monkey inferior colliculus.

Journal Article Frontiers in neural circuits · January 2012 The inferior colliculus (IC) is an essential stop early in the ascending auditory pathway. Though normally thought of as a predominantly auditory structure, recent work has uncovered a variety of non-auditory influences on firing rate in the IC. Here, we m ... Full text Open Access Cite

Systematic mapping of the monkey inferior colliculus reveals enhanced low frequency sound representation.

Journal Article Journal of neurophysiology · April 2011 We investigated the functional architecture of the inferior colliculus (IC) in rhesus monkeys. We systematically mapped multiunit responses to tonal stimuli and noise in the IC and surrounding tissue of six rhesus macaques, collecting data at evenly placed ... Full text Open Access Cite

Effects of Initial Eye Position on Saccades Evoked by Microstimulation in the Primate Superior Colliculus: Implications for Models of the SC Read-Out Process.

Journal Article Frontiers in integrative neuroscience · January 2011 The motor layers of the superior colliculus (SC) are thought to specify saccade amplitude and direction, independent of initial eye position. However, recent evidence suggests that eye position can modulate the level of activity of SC motor neurons. In thi ... Full text Open Access Cite

Looking at Sounds: Neural Mechanisms in the Primate Brain

Chapter · February 1, 2010 When you hear a salient sound, it is natural to look at it to find out what is happening. Orienting the eyes to look at sounds is essential to our ability to identify and understand the events occurring in our environment. This behavior involves both senso ... Full text Cite

Comparison of gain-like properties of eye position signals in inferior colliculus versus auditory cortex of primates.

Journal Article Frontiers in integrative neuroscience · January 2010 We evaluated to what extent the influence of eye position in the auditory pathway of primates can be described as a gain field. We compared single unit activity in the inferior colliculus (IC), core auditory cortex (A1) and the caudomedial belt (CM) region ... Full text Open Access Cite

Multisensory guidance of orienting behavior.

Journal Article Hearing research · December 2009 We use both vision and audition when localizing objects and events in our environment. However, these sensory systems receive spatial information in different coordinate systems: sounds are localized using inter-aural and spectral cues, yielding a head-cen ... Full text Cite

Reference frame of the ventriloquism aftereffect.

Journal Article The Journal of neuroscience : the official journal of the Society for Neuroscience · November 2009 Seeing the image of a newscaster on a television set causes us to think that the sound coming from the loudspeaker is actually coming from the screen. How images capture sounds is mysterious because the brain uses different methods for determining the loca ... Full text Cite

Motor-related signals in the intraparietal cortex encode locations in a hybrid, rather than eye-centered reference frame.

Journal Article Cereb Cortex · August 2009 Featured Publication The reference frame used by intraparietal cortex neurons to encode locations is controversial. Many previous studies have suggested eye-centered coding, whereas we have reported that visual and auditory signals employ a hybrid reference frame (i.e., a comb ... Full text Link to item Cite

A rate code for sound azimuth in monkey auditory cortex: implications for human neuroimaging studies.

Journal Article J Neurosci · April 2, 2008 Featured Publication Is sound location represented in the auditory cortex of humans and monkeys? Human neuroimaging experiments have had only mixed success at demonstrating sound location sensitivity in primary auditory cortex. This is in apparent conflict with studies in monk ... Full text Link to item Cite

Visual- and saccade-related signals in the primate inferior colliculus.

Journal Article Proc Natl Acad Sci U S A · November 6, 2007 Featured Publication The inferior colliculus (IC) is normally thought of as a predominantly auditory structure because of its early position in the ascending auditory pathway just before the auditory thalamus. Here, we show that a majority of IC neurons (64% of 180 neurons) in ... Full text Link to item Cite

Seeing sounds: visual and auditory interactions in the brain.

Journal Article Current opinion in neurobiology · August 2006 Objects and events can often be detected by more than one sensory system. Interactions between sensory systems can offer numerous benefits for the accuracy and completeness of the perception. Recent studies involving visual-auditory interactions have highl ... Full text Cite

Effects of reward and behavioral context on neural activity in the primate inferior colliculus.

Journal Article The Journal of neuroscience : the official journal of the Society for Neuroscience · July 2006 Neural activity in the inferior colliculus (IC) likely plays an integral role in the processing of various auditory parameters, such as sound location and frequency. However, little is known about the extent to which IC neural activity may be influenced by ... Full text Cite

Representation of eye position in primate inferior colliculus.

Journal Article Journal of neurophysiology · March 2006 We studied the representation of eye-position information in the primate inferior colliculus (IC). Monkeys fixated visual stimuli at one of eight or nine locations along the horizontal meridian between -24 and 24 degrees while sounds were presented from lo ... Full text Cite

Long lasting attenuation by prior sounds in auditory cortex of awake primates.

Journal Article Experimental brain research · January 2006 How the brain responds to sequences of sounds is a question of great relevance to a variety of auditory perceptual phenomena. We investigated how long the responses of neurons in the primary auditory cortex of awake monkeys are influenced by the previous s ... Full text Cite

The "other" transformation required for visual-auditory integration: representational format.

Journal Article Progress in brain research · January 2006 Multisensory integration of spatial signals requires not only that stimulus locations be encoded in the same spatial reference frame, but also that stimulus locations be encoded in the same representational format. Previous studies have addressed the issue ... Full text Cite

Eye-centered, head-centered, and complex coding of visual and auditory targets in the intraparietal sulcus.

Journal Article Journal of neurophysiology · October 2005 The integration of visual and auditory events is thought to require a joint representation of visual and auditory space in a common reference frame. We investigated the coding of visual and auditory space in the lateral and medial intraparietal areas (LIP, ... Full text Cite

Auditory saccades from different eye positions in the monkey: implications for coordinate transformations.

Journal Article Journal of neurophysiology · October 2004 Auditory spatial information arises in a head-centered coordinate frame, whereas the saccade command signals generated by the superior colliculus (SC) are thought to specify target locations in an eye-centered frame. However, auditory activity in the SC ap ... Full text Cite

A monotonic code for sound azimuth in primate inferior colliculus.

Journal Article Journal of cognitive neuroscience · November 2003 We investigated the format of the code for sound location in the inferior colliculi of three awake monkeys (Macaca mulatta). We found that roughly half of our sample of 99 neurons was sensitive to the free-field locations of broadband noise presented in th ... Full text Cite

Eye position affects activity in primary auditory cortex of primates.

Journal Article Current biology : CB · April 2003 BackgroundNeurons in primary auditory cortex are known to be sensitive to the locations of sounds in space, but the reference frame for this spatial sensitivity has not been investigated. Conventional wisdom holds that the auditory and visual path ... Full text Cite

How the brain keeps time

Journal Article Daedalus · 2003 Featured Publication Cite

Representation of sound location in the primate brain

Chapter · January 1, 2002 Knowing where things are in space is essential to our existence. The visual, auditory, and cutaneous senses all contribute to the perception of stimulus location, but they acquire spatial information in radically different ways. Positional information is l ... Cite

Converting neural signals from place codes to rate codes.

Journal Article Biological cybernetics · September 2001 The nervous system uses two basic types of formats for encoding information. The parameters of many sensory (and some premotor) signals are represented by the pattern of activity among an array of neurons each of which is optimally responsive to a differen ... Full text Cite

Afferent delays and the mislocalization of perisaccadic stimuli.

Journal Article Vision research · September 2001 Determining the precise moment a visual stimulus appears is difficult because visual response latencies vary. This temporal uncertainty could cause localization errors to brief visual targets presented before and during eye movements if the oculomotor syst ... Full text Cite

Eye position influences auditory responses in primate inferior colliculus.

Journal Article Neuron · February 2001 We examined the frame of reference of auditory responses in the inferior colliculus in monkeys fixating visual stimuli at different locations. Eye position modulated the level of auditory responses in 33% of the neurons we encountered, but it did not appea ... Full text Cite

Segregation of object and background motion in visual area MT: effects of microstimulation on eye movements.

Journal Article Neuron · June 2000 To track a moving object, its motion must first be distinguished from that of the background. The center-surround properties of neurons in the middle temporal visual area (MT) may be important for signaling the relative motion between object and background ... Full text Cite

Predicting perception from population codes.

Journal Article Nature neuroscience · March 2000 Full text Cite

Reading neural representations.

Journal Article Neuron · October 1998 Full text Cite

Neurophysiology: electrically evoking sensory experience.

Journal Article Current biology : CB · June 1998 Monkeys trained to distinguish touch stimuli that 'flutter' with different frequencies can similarly distinguish electrical stimulation of the somatosensory cortex according to its frequency; the implication is that the electrically-evoked patterns of cort ... Full text Cite

How is a sensory map read out? Effects of microstimulation in visual area MT on saccades and smooth pursuit eye movements.

Journal Article The Journal of neuroscience : the official journal of the Society for Neuroscience · June 1997 To generate behavioral responses based on sensory input, motor areas of the brain must interpret, or "read out," signals from sensory maps. Our experiments tested several algorithms for how the motor systems for smooth pursuit and saccadic eye movements mi ... Full text Cite

Neurophysiology: neural fingerprints of visual attention.

Journal Article Current biology : CB · November 1996 Pronounced effects of attention have been demonstrated in a region of visual cortex previously thought to be devoid of such influences; identifying the features critical for eliciting these effects should teach us a great deal about the neural underpinning ... Full text Cite

A comparison of the effects of microstimulation in area MT on saccades and smooth pursuit eye movements

Journal Article Investigative Ophthalmology and Visual Science · February 15, 1996 Purpose. Tracking a moving target that appears in the periphery involves two kinds of eye movements, saccades and smooth pursuit. Both kinds of movements require a signal of target velocity. The saccadic system must use a target velocity signal to compensa ... Cite

Saccades to somatosensory targets. I. behavioral characteristics.

Journal Article Journal of neurophysiology · January 1996 1. We compared the properties of saccades to somatosensory and visual targets. This comparison provides insight into the translation of sensory signals coding target location in different sensory coordinate frameworks into motor commands of a common format ... Full text Cite

Saccades to somatosensory targets. II. motor convergence in primate superior colliculus.

Journal Article Journal of neurophysiology · January 1996 1. We examined cells with saccade-related activity in the superior colliculus (SC) of monkeys performing saccades to both somatosensory and visual targets. Our goals were 1) to determine whether signals from these separate sensory systems have converged on ... Full text Cite

Saccades to somatosensory targets. III. eye-position-dependent somatosensory activity in primate superior colliculus.

Journal Article Journal of neurophysiology · January 1996 1. We recorded from cells with sensory responses to somatosensory stimuli in the superior colliculus (SC) of awake monkeys. Our goal was to determine the frame of reference of collicular somatosensory signals by seeing whether the positions of the eyes inf ... Full text Cite

Interpreting sensory maps in visual cortex

Journal Article IBRO News · 1996 Cite

Two models for transforming auditory signals from head-centered to eye-centered coordinates.

Journal Article Biological cybernetics · January 1992 Two models for transforming auditory signals from head-centered to eye-centered coordinates are presented. The vector subtraction model subtracts a rate-coded eye position signal from a topographically weighted auditory target position signal to produce a ... Full text Cite