Storage capacity of neural networks: Effect of the fluctuations of the number of active neurons per memory
The storage capacity of an attractor neural network with excitatory couplings is shown to depend, in the thermodynamic limit, not only on the fraction of active neurons per pattern (the coding rate) but also on the fluctuations of this number across patterns. The capacity is calculated for the case in which every pattern has exactly the same number of active neurons. For every coding level, the capacity is larger than for random patterns. These results are supported by numerical simulations using an exhaustive search algorithm and, in the sparse-coding limit, partly resolve the paradox of the discrepancy between the capacity of the Willshaw model and the optimal capacity.
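The distinction at the heart of the abstract is between random patterns, where the number of active neurons fluctuates from pattern to pattern, and constrained patterns with exactly the same number of active neurons in each. A minimal sketch of the two pattern ensembles (the variable names and parameter values are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000   # number of neurons (illustrative value)
P = 500    # number of stored patterns
f = 0.05   # coding rate: mean fraction of active neurons

# Random patterns: each neuron is active independently with probability f,
# so the number of active neurons per pattern is Binomial(N, f) and
# fluctuates with standard deviation sqrt(N f (1 - f)).
random_patterns = (rng.random((P, N)) < f).astype(int)

# Fixed-activity patterns: exactly K = f*N active neurons in every pattern,
# so the fluctuations of the activity per pattern vanish.
K = int(f * N)
fixed_patterns = np.zeros((P, N), dtype=int)
for mu in range(P):
    active = rng.choice(N, size=K, replace=False)
    fixed_patterns[mu, active] = 1

print(random_patterns.sum(axis=1).std())  # of order sqrt(N f (1-f))
print(fixed_patterns.sum(axis=1).std())   # exactly 0
```

The abstract's claim is that suppressing the per-pattern activity fluctuations (the second ensemble) increases the storage capacity at every coding level relative to the first ensemble.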