
Forgetting leads to chaos in attractor networks

Publication, Journal Article
Pereira-Obilinovic, U; Aljadeff, J; Brunel, N
November 30, 2021

Attractor networks are an influential theory for memory storage in brain systems. This theory has recently been challenged by the observation of strong temporal variability in neuronal recordings during memory tasks. In this work, we study a sparsely connected attractor network where memories are learned according to a Hebbian synaptic plasticity rule. After recapitulating known results for the continuous, sparsely connected Hopfield model, we investigate a model in which new memories are learned continuously and old memories are forgotten, using an online synaptic plasticity rule. We show that for a forgetting time scale that optimizes storage capacity, the qualitative features of the network's memory retrieval dynamics are age-dependent: the most recent memories are retrieved as fixed-point attractors, while older memories are retrieved as chaotic attractors characterized by strong heterogeneity and temporal fluctuations. Therefore, fixed-point and chaotic attractors coexist in the network phase space. The network exhibits a continuum of statistically distinguishable memory states, in which chaotic fluctuations appear abruptly above a critical age and then increase gradually until the memory disappears. We develop a dynamical mean field theory (DMFT) to analyze the age-dependent dynamics and compare the theory with simulations of large networks. Our numerical simulations show that a high degree of sparsity is necessary for the DMFT to accurately predict the network capacity. Finally, our theory provides specific predictions for delay response tasks with aging memoranda. Our theory of attractor networks that continuously learn new information at the price of forgetting old memories can account for the observed diversity of retrieval states in the cortex, and in particular for the strong temporal fluctuations of cortical activity.
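As a rough illustration of the "learning with forgetting" (palimpsest) idea described in the abstract, the sketch below stores memories sequentially with an online Hebbian rule whose older traces decay exponentially, then tests retrieval as a function of memory age. This is a deliberately simplified binary Hopfield-style example, not the continuous, sparsely connected rate model analyzed in the paper; the network size, forgetting rate, cue noise, and update dynamics are all arbitrary assumptions chosen for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    N = 1000             # number of neurons (illustrative value)
    P = 50               # number of memories learned sequentially
    forget_rate = 0.05   # decay of old synaptic traces (hypothetical forgetting time scale)

    # Online Hebbian learning with forgetting (illustrative palimpsest rule,
    # not the exact plasticity rule of the paper):
    #   W <- (1 - forget_rate) * W + (1/N) * xi xi^T
    patterns = rng.choice([-1, 1], size=(P, N))
    W = np.zeros((N, N))
    for xi in patterns:
        W = (1 - forget_rate) * W + np.outer(xi, xi) / N
    np.fill_diagonal(W, 0.0)

    def retrieve(cue, steps=50):
        """Synchronous sign dynamics (a simplification; the paper studies
        continuous rate dynamics in a sparsely connected network)."""
        s = cue.copy()
        for _ in range(steps):
            s = np.sign(W @ s)
            s[s == 0] = 1
        return s

    # Cue each memory with a noisy version of itself and measure the retrieval
    # overlap; older memories (larger age) are retrieved less reliably.
    for age, xi in enumerate(reversed(patterns)):   # age 0 = most recently stored
        cue = xi * np.where(rng.random(N) < 0.1, -1, 1)  # flip ~10% of bits
        overlap = retrieve(cue) @ xi / N
        if age in (0, 10, 25, 49):
            print(f"memory age {age:2d}: retrieval overlap {overlap:+.2f}")

In this toy version, forgetting only degrades old memories gradually; the paper's result is stronger, showing that with a continuous rate model and optimal forgetting time scale, old memories are retrieved as chaotic attractors with strong temporal fluctuations rather than as fixed points.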


Publication Date

November 30, 2021

Citation

APA: Pereira-Obilinovic, U., Aljadeff, J., & Brunel, N. (2021). Forgetting leads to chaos in attractor networks.
Chicago: Pereira-Obilinovic, Ulises, Johnatan Aljadeff, and Nicolas Brunel. “Forgetting leads to chaos in attractor networks,” November 30, 2021.
ICMJE: Pereira-Obilinovic U, Aljadeff J, Brunel N. Forgetting leads to chaos in attractor networks. 2021 Nov 30.
MLA: Pereira-Obilinovic, Ulises, et al. Forgetting leads to chaos in attractor networks. Nov. 2021.
NLM: Pereira-Obilinovic U, Aljadeff J, Brunel N. Forgetting leads to chaos in attractor networks. 2021 Nov 30.
