
A sparse quantized Hopfield network for online-continual memory

Authors :
Nicholas Alonso
Jeffrey L. Krichmar
Source :
Nature Communications, Vol 15, Iss 1, Pp 1-15 (2024)
Publication Year :
2024
Publisher :
Nature Portfolio.

Abstract

An important difference between brains and deep neural networks is the way they learn. Nervous systems learn online: data arrive as a stream of noisy points that are not independent and identically distributed (non-i.i.d.). Further, synaptic plasticity in the brain depends only on information local to each synapse. Deep networks, by contrast, typically use non-local learning algorithms and are trained offline on clean, i.i.d. data. Understanding how neural networks learn under the same constraints as the brain is an open problem for neuroscience and neuromorphic computing, and a standard approach to this problem has yet to be established. In this paper, we propose that discrete graphical models that learn via an online maximum a posteriori (MAP) learning algorithm could provide such an approach. We implement this kind of model in a neural network called the Sparse Quantized Hopfield Network (SQHN). We show that our model outperforms state-of-the-art neural networks on associative memory tasks, outperforms them in online, continual settings, learns efficiently from noisy inputs, and surpasses baselines on an episodic memory task.
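
As a rough illustration of the learning constraints the abstract describes (a one-pass data stream and purely synapse-local plasticity), the sketch below implements a classical binary Hopfield network with a Hebbian update. This is a minimal, assumed baseline, not the authors' SQHN; all names and parameters here (N, lr, steps) are illustrative choices, not values from the paper.

import numpy as np

rng = np.random.default_rng(0)
N = 64                        # number of binary (+1/-1) neurons
W = np.zeros((N, N))          # synaptic weight matrix, learned online

def store(W, x, lr):
    # Local Hebbian rule: the update to W[i, j] depends only on the
    # activities x[i] and x[j] at that synapse's two endpoints.
    W += lr * np.outer(x, x)
    np.fill_diagonal(W, 0.0)  # no self-connections

def recall(W, cue, steps=20):
    # Iterative retrieval: repeated synchronous sign updates pull the
    # state toward a stored attractor of the network.
    x = cue.copy()
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1         # break ties deterministically
    return x

# One-pass "online" stream: each pattern is seen once, then discarded.
patterns = [rng.choice([-1, 1], size=N) for _ in range(5)]
for p in patterns:
    store(W, p, lr=1.0 / N)

# Retrieve the first pattern from a corrupted cue (20% of bits flipped).
cue = patterns[0].copy()
flipped = rng.choice(N, size=N // 5, replace=False)
cue[flipped] *= -1
overlap = recall(W, cue) @ patterns[0] / N
print(f"overlap with stored pattern after recall: {overlap:.2f}")

Because the Hebbian update to each weight uses only the pre- and post-synaptic activities, every pattern can be stored as it arrives and never revisited, which is exactly the online, local regime the abstract contrasts with offline backpropagation.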

Subjects

Science

Details

Language :
English
ISSN :
2041-1723
Volume :
15
Issue :
1
Database :
Directory of Open Access Journals
Journal :
Nature Communications
Publication Type :
Academic Journal
Accession number :
edsdoj.fbb9458be54d423f9e94a0c2eca4cd33
Document Type :
Article
Full Text :
https://doi.org/10.1038/s41467-024-46976-4