Vector averaging for smooth pursuit eye movements initiated by two moving targets in monkeys.
The visual input for pursuit eye movements is represented in the cerebral cortex as the distributed activity of neurons that are tuned for both the direction and speed of target motion. To probe how the motor system uses this distributed code to compute a command for smooth eye movements, we recorded the initiation of pursuit for 150-msec presentations of two spots moving at different speeds and/or in different directions. With equal probability, one of the two spots continued to move at the same speed and in the same direction and became the tracking target, whereas the other disappeared and served as a distractor. We measured eye acceleration in the interval from 110 to 206 msec after the onset of spot motion, an interval that lies within the open-loop period for pursuit and during which eye motion was affected by both spots. Our results demonstrate that weighted vector averaging is used to combine the responses to two moving spots. We found only a small number of responses that were consistent with either vector summation or winner-take-all computations. In addition, our data show that it is difficult for the monkey to defeat vector averaging without extended training on the use of an explicit cue about which spot will become the target. We argue that our experiment reveals the computations done by the pursuit system in the absence of attentional bias and that vector averaging is normally used to read the distributed code of image motion, even when there is only one target.
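The three candidate computations compared in the abstract can be sketched numerically. In this minimal illustration, the velocities, weights, and selection rule are hypothetical values chosen for clarity, not parameters from the paper:

```python
import numpy as np

def vector_average(v1, v2, w1=0.5, w2=0.5):
    """Weighted vector average of two target velocities.

    This is the computation the abstract reports for pursuit
    initiation; the weights here are illustrative placeholders.
    """
    return (w1 * v1 + w2 * v2) / (w1 + w2)

def vector_sum(v1, v2):
    """Vector summation: responses to both spots simply add."""
    return v1 + v2

def winner_take_all(v1, v2):
    """Winner-take-all: follow only one spot.

    The rule used to pick the winner (here, larger speed) is an
    assumption for illustration only.
    """
    return v1 if np.linalg.norm(v1) >= np.linalg.norm(v2) else v2

# Two spot velocities in deg/sec (illustrative values):
v1 = np.array([10.0, 0.0])   # rightward motion
v2 = np.array([0.0, 10.0])   # upward motion

avg = vector_average(v1, v2)     # oblique, reduced-speed response
total = vector_sum(v1, v2)       # oblique, increased-speed response
winner = winner_take_all(v1, v2) # response to one spot alone
```

With equal weights, averaging predicts an oblique eye movement at a speed below either spot's speed, whereas summation predicts an oblique movement faster than either spot, which is one way the three hypotheses can be distinguished behaviorally.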
Lisberger, SG; Ferrera, VP