Slow stochastic Hebbian learning of classes of stimuli in a recurrent neural network.

Publication Status

  • Published

Publication Type

  • Journal Article

Abstract

We study unsupervised Hebbian learning in a recurrent network in which synapses have a finite number of stable states. Stimuli received by the network are drawn at random at each presentation from a set of classes. Each class is defined as a cluster in stimulus space, centred on the class prototype. The presentation protocol is chosen to mimic the protocols of visual memory experiments in which a set of stimuli is presented repeatedly in random order. The statistics of the input stream may be stationary or may change over time. Each stimulus induces, in a stochastic way, transitions between stable synaptic states. The learning dynamics is studied analytically in the slow learning limit, in which a given stimulus has to be presented many times before it is memorized, i.e. before synaptic modifications enable a pattern of activity correlated with the stimulus to become an attractor of the recurrent network. We show that in this limit the synaptic matrix becomes more correlated with the class prototypes than with any single instance of a class. We also show that the number of classes that can be learned increases sharply when the coding level decreases, and we determine the speeds at which classes are learned and forgotten when the statistics of the input stream change.
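As a rough illustration of the model described above, the following sketch (Python, with illustrative parameter values not taken from the paper) simulates a stochastic Hebbian rule on binary synapses: each synapse has two stable states, and each presented stimulus triggers potentiation or depression with small probabilities. Stimuli are noisy instances of sparse class prototypes, and after many presentations the learned matrix is compared against the prototypes and against single instances. The specific transition rule and the covariance-style comparison term are assumptions chosen for simplicity, not the paper's exact model.

    import numpy as np

    rng = np.random.default_rng(0)

    # --- illustrative parameters (not from the paper) ---
    N = 200        # number of neurons
    P = 5          # number of classes
    f = 0.1        # coding level: fraction of active neurons per pattern
    flip = 0.05    # within-class noise: probability of flipping each prototype bit
    q_pot = 0.02   # potentiation probability per presentation (slow learning: q << 1)
    q_dep = q_pot * f / (1.0 - f)   # depression probability, roughly balancing the drift
    T = 20000      # number of stimulus presentations

    # sparse binary prototypes at coding level f
    prototypes = (rng.random((P, N)) < f).astype(np.int8)

    def draw_instance(proto):
        """Draw a noisy class member by flipping each bit with probability `flip`."""
        noise = rng.random(proto.shape) < flip
        return np.where(noise, 1 - proto, proto)

    # binary synapses J[i, j] in {0, 1}, initialized at random
    J = (rng.random((N, N)) < 0.5).astype(np.int8)

    for _ in range(T):
        xi = draw_instance(prototypes[rng.integers(P)])
        pre, post = np.meshgrid(xi, xi)      # pre[i, j] = xi[j], post[i, j] = xi[i]
        both = (pre == 1) & (post == 1)      # both neurons active -> candidate potentiation
        mismatch = pre != post               # one active, one silent -> candidate depression
        r = rng.random(J.shape)              # stochastic transitions between stable states
        J[both & (r < q_pot)] = 1
        J[mismatch & (r < q_dep)] = 0

    def correlation_with(pattern):
        """Correlate the synaptic matrix with a covariance-style Hebbian term."""
        h = np.outer(pattern - f, pattern - f)
        return np.corrcoef(J.ravel(), h.ravel())[0, 1]

    proto_corr = np.mean([correlation_with(p) for p in prototypes])
    inst_corr = np.mean([correlation_with(draw_instance(p)) for p in prototypes])
    print(f"correlation with class prototypes: {proto_corr:.3f}")
    print(f"correlation with single noisy instances: {inst_corr:.3f}")

Because the transition probabilities are small, the matrix averages over many noisy presentations of each class, so the prototype correlation typically comes out higher than the single-instance correlation, consistent with the slow-learning result stated in the abstract.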

Cited Authors

  • Brunel, N; Carusi, F; Fusi, S

Published Date

  • February 1998

Published In

  • Network: Computation in Neural Systems

Volume / Issue

  • 9 / 1

Start / End Page

  • 123 - 152

PubMed ID

  • 9861982

International Standard Serial Number (ISSN)

  • 0954-898X

Language

  • eng

Place of Publication

  • England