8/21/2023

Neural network model of memory

Episodic memory has a dynamic nature: when we recall past episodes, we retrieve not only their content, but also their temporal structure. When we recall a past experience, accessing what is known as an 'episodic memory', it usually does not appear as a still image or a snapshot of what occurred. Instead, our memories tend to be dynamic: we remember how a sequence of events unfolded, and when we do this, we often re-experience at least part of that same sequence. For example, a mouse might remember how it went down a hole and found cheese there; if the memory includes physical movement, it combines space and time into a remembered trajectory. The phenomenon of replay in the hippocampus of mammals offers a remarkable example of this temporal dynamics.

However, most quantitative models of memory treat memories as static configurations, neglecting the temporal unfolding of the retrieval process. 'Attractor network models', one type of mathematical model that neuroscientists use to represent how neurons communicate with each other to store memories, have so far focused on snapshot memories.

Here, we introduce a continuous attractor network model with a memory-dependent asymmetric component in the synaptic connectivity, which spontaneously breaks the equilibrium of the memory configurations and produces dynamic retrieval. By calculating the storage capacity, we show that the dynamic component does not impair memory capacity, and can even enhance it in certain regimes. A detailed analysis of the model, with analytical calculations and numerical simulations, shows that it can robustly retrieve multiple dynamical memories, and that this feature is largely independent of the details of its implementation.
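The core mechanism described above (a symmetric connectivity that stores a memory as a stable activity bump, plus an asymmetric component that breaks the equilibrium and makes the bump move) can be illustrated with a toy simulation. This is only a minimal sketch of the general idea, not the authors' actual model: the network size N, the coupling strengths J0 and J1, the asymmetry strength gamma, and the rate dynamics below are all illustrative assumptions.

```python
import numpy as np

N = 200                              # number of neurons on a ring (illustrative)
theta = 2 * np.pi * np.arange(N) / N # preferred position of each neuron

# Symmetric connectivity stores the memory as a bump attractor; the
# asymmetric (odd, sine-shaped) component pushes the bump along the ring,
# turning a static memory into a retrieved trajectory.
J0, J1 = -0.5, 1.0                   # illustrative coupling strengths
gamma = 0.4                          # strength of the asymmetric component
d = theta[:, None] - theta[None, :]
W = (J0 + J1 * np.cos(d) + gamma * np.sin(d)) / N

def relu(x):
    return np.maximum(x, 0.0)

# Rate dynamics: tau dr/dt = -r + relu(W r + I), integrated with Euler steps
dt, tau, I = 0.1, 1.0, 0.1
r = relu(np.cos(theta))              # cue the network with a bump at angle 0
positions = []
for _ in range(2000):
    r = r + dt / tau * (-r + relu(W @ r + I))
    # decode the bump position as the angle of the population vector
    positions.append(np.angle(np.sum(r * np.exp(1j * theta))))

# With gamma = 0 the bump would stay put (a static snapshot memory);
# with gamma > 0 it drifts steadily around the ring (dynamic retrieval).
drift = np.unwrap(positions)
```

Unwrapping the decoded angle shows the bump position advancing steadily over time rather than settling at a fixed point, which is the signature of dynamic retrieval in this kind of model.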