
Storage capacity of networks with discrete synapses and sparsely encoded memories.

Journal Article
Feng, Y; Brunel, N
Published in: Phys Rev E
May 2022

Attractor neural networks are one of the leading theoretical frameworks for the formation and retrieval of memories in networks of biological neurons. In this framework, a pattern imposed by external inputs to the network is said to be learned when it becomes a fixed-point attractor of the network dynamics. The storage capacity is the maximum number of patterns that can be learned by the network. In this paper, we study the storage capacity of fully connected and sparsely connected networks with a binarized Hebbian rule, for arbitrary coding levels. Our results show that a network with discrete synapses has a storage capacity similar to that of the model with continuous synapses, and that this capacity tends asymptotically towards the optimal capacity, in the space of all possible binary connectivity matrices, in the sparse coding limit. We also derive finite-coding-level corrections to the asymptotic solution in the sparse coding limit. These corrections indicate that the capacity of networks with Hebbian learning rules converges to the optimal capacity extremely slowly as the coding level becomes small. Our results also show that in networks with sparse binary connectivity matrices, the information capacity per synapse is larger than in the fully connected case, and thus such networks store information more efficiently.
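The binarized Hebbian rule studied in the abstract can be illustrated with a minimal sketch in the spirit of the Willshaw model: synapses are clipped to 0/1, a connection being present whenever its two neurons were coactive in at least one stored sparse pattern. All parameter values below (network size, coding level, number of patterns, threshold) are illustrative choices, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200   # number of neurons (illustrative)
f = 0.05  # coding level: fraction of active neurons per pattern
P = 10    # number of stored patterns

# Sparse binary patterns: each neuron is active with probability f.
patterns = (rng.random((P, N)) < f).astype(int)

# Clipped (binarized) Hebbian rule: a synapse is 1 iff its two neurons
# were coactive in at least one of the stored patterns.
W = (patterns.T @ patterns > 0).astype(int)
np.fill_diagonal(W, 0)  # no self-connections

def update(state, theta):
    """One parallel threshold update of all binary units."""
    return (W @ state >= theta).astype(int)

# Test retrieval of a stored pattern: each of its k active units receives
# input from its k-1 coactive partners, so with threshold k-1 the active
# units are guaranteed to stay on after one update.
xi = patterns[0]
k = int(xi.sum())
recalled = update(xi, theta=k - 1)

print("active units preserved:", bool(recalled[xi == 1].all()))
print("spurious activations:", int(recalled[xi == 0].sum()))
```

With a threshold just below the number of active units per pattern, every active unit of a stored pattern receives recurrent input from all of its coactive partners and therefore stays on; spurious activations of other units become likely only as the number of stored patterns grows, which is what limits the storage capacity.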


Published In

Phys Rev E

DOI

10.1103/PhysRevE.105.054408

EISSN

2470-0053

Publication Date

May 2022

Volume

105

Issue

5-1

Start / End Page

054408

Location

United States

Related Subject Headings

  • 51 Physical sciences
  • 49 Mathematical sciences
  • 40 Engineering
 

Citation

APA: Feng, Y., & Brunel, N. (2022). Storage capacity of networks with discrete synapses and sparsely encoded memories. Phys Rev E, 105(5–1), 054408. https://doi.org/10.1103/PhysRevE.105.054408
Chicago: Feng, Yu, and Nicolas Brunel. “Storage capacity of networks with discrete synapses and sparsely encoded memories.” Phys Rev E 105, no. 5–1 (May 2022): 054408. https://doi.org/10.1103/PhysRevE.105.054408.
ICMJE: Feng Y, Brunel N. Storage capacity of networks with discrete synapses and sparsely encoded memories. Phys Rev E. 2022 May;105(5–1):054408.
MLA: Feng, Yu, and Nicolas Brunel. “Storage capacity of networks with discrete synapses and sparsely encoded memories.” Phys Rev E, vol. 105, no. 5–1, May 2022, p. 054408. Pubmed, doi:10.1103/PhysRevE.105.054408.
NLM: Feng Y, Brunel N. Storage capacity of networks with discrete synapses and sparsely encoded memories. Phys Rev E. 2022 May;105(5–1):054408.
