
Monkeys and Humans Implement Causal Inference to Simultaneously Localize Auditory and Visual Stimuli

Journal Article
Mohl, J; Pearson, J; Groh, J
2019

The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, optimally unifying such signals requires assigning particular signals to the same or different underlying objects or events. Many prior studies (especially in animals) have assumed fusion of cross-modal information, whereas recent work in humans has begun to probe the appropriateness of this assumption. Here we present results from a novel behavioral task in which both monkeys and humans localized visual and auditory stimuli and reported their perceived sources through saccadic eye movements. When the locations of the visual and auditory stimuli were widely separated, subjects made two saccades; when the two stimuli were presented at the same location, they made only a single saccade. Intermediate levels of separation produced mixed response patterns: a single saccade to an intermediate position on some trials, or separate saccades to both locations on others. The distribution of responses was well described by a hierarchical causal inference model that accurately predicted both the explicit “same vs. different” source judgements and the biases in localization of the source(s) under each of these conditions. The results from this task are broadly consistent with prior work in humans across a wide variety of analogous tasks, extending the study of multisensory causal inference to non-human primates and to a natural behavioral task with both a categorical assay of the number of perceived sources and a continuous report of the perceived position of the stimuli.

We experience the world through multiple sensory systems, which interact to shape perception. To do so, the brain must first determine which pieces of sensory input arise from the same source and which have nothing to do with one another. To probe how the brain accomplishes this causal inference, we developed a naturalistic paradigm that provides a behavioral report of both the number of perceived stimuli and their locations. We tested performance on this task in both humans and monkeys, and we found that both species perform causal inference in a similar manner. By providing this cross-species comparison at the behavioral level, our paradigm lays the groundwork for future experiments using neuronal recording techniques that may be impractical or impossible in human subjects.
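The causal inference model referenced above follows the general logic of Bayesian causal inference for cue combination: the observer weighs the probability that the visual and auditory measurements arose from a single source against the probability that they arose from two, and reports fused or segregated location estimates accordingly. Below is a minimal sketch of that standard model (in the style of Kording et al., 2007), not the paper's exact hierarchical formulation; the sensory noise levels, spatial prior, and common-cause prior used here are illustrative assumptions rather than values fitted in the study.

# Minimal sketch of standard Bayesian causal inference for audiovisual
# localization (Kording et al., 2007 style). Parameters below are
# illustrative assumptions, not estimates from this paper.
import numpy as np

def causal_inference(x_v, x_a, sigma_v=2.0, sigma_a=8.0,
                     sigma_p=15.0, mu_p=0.0, p_common=0.5):
    """Return P(common cause | measurements) and visual/auditory location estimates (deg)."""
    var_v, var_a, var_p = sigma_v**2, sigma_a**2, sigma_p**2

    # Likelihood of the two noisy measurements under one shared source (C = 1).
    denom1 = var_v * var_a + var_v * var_p + var_a * var_p
    like_c1 = np.exp(-0.5 * ((x_v - x_a)**2 * var_p +
                             (x_v - mu_p)**2 * var_a +
                             (x_a - mu_p)**2 * var_v) / denom1) / (2 * np.pi * np.sqrt(denom1))

    # Likelihood under two independent sources (C = 2).
    like_c2 = (np.exp(-0.5 * (x_v - mu_p)**2 / (var_v + var_p)) /
               np.sqrt(2 * np.pi * (var_v + var_p)) *
               np.exp(-0.5 * (x_a - mu_p)**2 / (var_a + var_p)) /
               np.sqrt(2 * np.pi * (var_a + var_p)))

    # Posterior probability that both measurements share a common cause.
    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Reliability-weighted fusion if C = 1; independent (segregated) estimates if C = 2.
    s_fused = (x_v / var_v + x_a / var_a + mu_p / var_p) / (1 / var_v + 1 / var_a + 1 / var_p)
    s_v_seg = (x_v / var_v + mu_p / var_p) / (1 / var_v + 1 / var_p)
    s_a_seg = (x_a / var_a + mu_p / var_p) / (1 / var_a + 1 / var_p)

    # Model averaging: blend the two hypotheses by their posterior probability.
    s_v = post_c1 * s_fused + (1 - post_c1) * s_v_seg
    s_a = post_c1 * s_fused + (1 - post_c1) * s_a_seg
    return post_c1, s_v, s_a

# Nearby stimuli -> high probability of one source and a single fused target;
# widely separated stimuli -> two sources and two nearly unbiased targets.
print(causal_inference(x_v=2.0, x_a=4.0))
print(causal_inference(x_v=-12.0, x_a=18.0))

In this framing, the categorical "one vs. two saccades" report maps onto the inferred number of causes, and the saccade endpoints map onto the location estimates, which is why such a model can jointly predict both judgement types described in the abstract.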


Publication Date

2019