
Efficient supervised learning in networks with binary synapses.

Publication, Journal Article
Baldassi, C; Braunstein, A; Brunel, N; Zecchina, R
Published in: Proc Natl Acad Sci U S A
June 26, 2007

Recent experimental studies indicate that synaptic changes induced by neuronal activity are discrete jumps between a small number of stable states. Learning in systems with discrete synapses is known to be a computationally hard problem. Here, we study a neurobiologically plausible on-line learning algorithm that derives from belief propagation algorithms. We show that it performs remarkably well in a model neuron with binary synapses, and a finite number of "hidden" states per synapse, that has to learn a random classification task. Such a system is able to learn a number of associations close to the theoretical limit in time that is sublinear in system size. This is to our knowledge the first on-line algorithm that is able to achieve efficiently a finite number of patterns learned per binary synapse. Furthermore, we show that performance is optimal for a finite number of hidden states that becomes very small for sparse coding. The algorithm is similar to the standard "perceptron" learning algorithm, with an additional rule for synaptic transitions that occur only if a currently presented pattern is "barely correct." In this case, the synaptic changes are metaplastic only (change in hidden states and not in actual synaptic state), stabilizing the synapse in its current state. Finally, we show that a system with two visible states and K hidden states is much more robust to noise than a system with K visible states. We suggest that this rule is sufficiently simple to be easily implemented by neurobiological systems or in hardware.
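The learning rule described above can be illustrated with a toy simulation. The sketch below is an assumption-laden simplification, not the authors' exact belief-propagation-derived algorithm: each synapse carries a hidden state h in {-K, ..., -1, 1, ..., K} whose sign is the visible binary weight; misclassified patterns trigger a plastic update, while "barely correct" patterns (margin at or below an assumed threshold THETA) trigger, with an assumed probability PS, a metaplastic-only move that pushes supporting synapses deeper into their current state without changing their sign. All parameter values (N, K, THETA, PS) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy parameters (illustrative choices, not taken from the paper)
N, K, P = 101, 4, 40     # synapses (odd, to avoid ties), hidden states per sign, patterns
THETA = 1                # "barely correct" margin threshold (assumed)
PS = 0.3                 # probability of a metaplastic move (assumed)

X = rng.choice([-1, 1], size=(P, N))   # random +/-1 input patterns
y = rng.choice([-1, 1], size=P)        # random target classifications

# Hidden state h_i in {-K,...,-1, 1,...,K}; visible binary weight is sign(h_i)
h = rng.choice([-1, 1], size=N)

def step(h, d):
    """Move hidden states by d, skipping the forbidden zero state
    and saturating at the boundaries +/-K."""
    h = h + d
    h[h == 0] = d[h == 0]      # jump across zero
    return np.clip(h, -K, K)

def train(h, epochs=200):
    for _ in range(epochs):
        errors = 0
        for mu in rng.permutation(P):
            x, t = X[mu], y[mu]
            w = np.sign(h)                 # binary visible weights
            margin = t * (w @ x)
            if margin < 0:                 # misclassified: plastic update
                h = step(h, t * x)
                errors += 1
            elif margin <= THETA:          # barely correct: metaplastic only
                if rng.random() < PS:
                    support = (t * x == w) # synapses voting for the correct output
                    d = np.where(support, t * x, 0)
                    h = step(h, d)         # pushes deeper; visible sign unchanged
        if errors == 0:                    # all patterns classified correctly
            break
    return h

h = train(h)
w = np.sign(h)
accuracy = np.mean(np.sign(X @ w) == y)
```

Note the asymmetry the abstract highlights: the visible weight is only ever the sign of the hidden state, so the metaplastic move stabilizes a synapse without any change in its visible efficacy.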


Published In

Proc Natl Acad Sci U S A

DOI

10.1073/pnas.0700324104

EISSN

1091-6490

Publication Date

June 26, 2007

Volume

104

Issue

26

Start / End Page

11079 / 11084

Location

United States

Related Subject Headings

  • Synapses
  • Neural Networks, Computer
  • Nerve Net
  • Models, Neurological
  • Learning
  • Algorithms
 

Citation

APA: Baldassi, C., Braunstein, A., Brunel, N., & Zecchina, R. (2007). Efficient supervised learning in networks with binary synapses. Proc Natl Acad Sci U S A, 104(26), 11079–11084. https://doi.org/10.1073/pnas.0700324104

Chicago: Baldassi, Carlo, Alfredo Braunstein, Nicolas Brunel, and Riccardo Zecchina. “Efficient supervised learning in networks with binary synapses.” Proc Natl Acad Sci U S A 104, no. 26 (June 26, 2007): 11079–84. https://doi.org/10.1073/pnas.0700324104.

ICMJE: Baldassi C, Braunstein A, Brunel N, Zecchina R. Efficient supervised learning in networks with binary synapses. Proc Natl Acad Sci U S A. 2007 Jun 26;104(26):11079–84.

MLA: Baldassi, Carlo, et al. “Efficient supervised learning in networks with binary synapses.” Proc Natl Acad Sci U S A, vol. 104, no. 26, June 2007, pp. 11079–84. Pubmed, doi:10.1073/pnas.0700324104.

NLM: Baldassi C, Braunstein A, Brunel N, Zecchina R. Efficient supervised learning in networks with binary synapses. Proc Natl Acad Sci U S A. 2007 Jun 26;104(26):11079–11084.