201. A sparse quantized Hopfield network for online-continual memory.
- Author
- Alonso, Nicholas and Krichmar, Jeffrey L.
- Subjects
- HOPFIELD networks, ARTIFICIAL neural networks, MACHINE learning, EPISODIC memory, NEUROPLASTICITY, SYNAPSES
- Abstract
An important difference between brains and deep neural networks is the way they learn. Nervous systems learn online, where a stream of noisy data points is presented in a way that is not independent and identically distributed (non-i.i.d.). Further, synaptic plasticity in the brain depends only on information local to synapses. Deep networks, on the other hand, typically use non-local learning algorithms and are trained offline on clean, i.i.d. data. Understanding how neural networks learn under the same constraints as the brain is an open problem for neuroscience and neuromorphic computing, and a standard approach to it has yet to be established. In this paper, we propose that discrete graphical models trained with an online maximum a posteriori (MAP) learning algorithm could provide such an approach. We implement this kind of model in a neural network called the Sparse Quantized Hopfield Network. We show that our model outperforms state-of-the-art neural networks on associative memory tasks, outperforms these networks in online-continual settings, learns efficiently from noisy inputs, and outperforms baselines on an episodic memory task. Brains and neuromorphic systems learn with local learning rules in online-continual learning scenarios, and designing neural networks that learn effectively under these conditions is challenging. The authors introduce a neural network that implements an effective, principled approach to local, online-continual learning on associative memory tasks. [ABSTRACT FROM AUTHOR]
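For orientation only, the sketch below shows the kind of local, online associative-memory learning the abstract describes: a classical binary Hopfield network whose Hebbian update uses only information local to each synapse, storing patterns one at a time as they arrive. This is an illustrative assumption-laden sketch of the classical model, not the authors' Sparse Quantized Hopfield Network, which additionally uses sparse, quantized codes and MAP-based learning.

```python
import numpy as np

class HopfieldMemory:
    """Classical binary Hopfield network with an online, local (Hebbian)
    learning rule. Illustrative only; not the paper's SQHN."""

    def __init__(self, n_units):
        self.n = n_units
        self.W = np.zeros((n_units, n_units))

    def store(self, pattern):
        # Online Hebbian update: each synapse W[i, j] changes using only
        # the activities of the two units it connects (a local rule).
        x = np.asarray(pattern, dtype=float).reshape(-1, 1)
        self.W += (x @ x.T) / self.n
        np.fill_diagonal(self.W, 0.0)  # no self-connections

    def recall(self, probe, steps=20):
        # Iteratively denoise a corrupted cue toward a stored attractor.
        x = np.asarray(probe, dtype=float).copy()
        for _ in range(steps):
            x = np.sign(self.W @ x)
            x[x == 0] = 1.0  # break ties consistently
        return x

# Usage: store a bipolar (+1/-1) pattern as it arrives, then recall it
# from a cue with 10% of its bits flipped.
rng = np.random.default_rng(0)
mem = HopfieldMemory(n_units=100)
pattern = rng.choice([-1.0, 1.0], size=100)
mem.store(pattern)

noisy = pattern.copy()
flip = rng.choice(100, size=10, replace=False)
noisy[flip] *= -1
recovered = mem.recall(noisy)
print("bits recovered:", int((recovered == pattern).sum()), "/ 100")
```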
- Published
- 2024
- Full Text
- View/download PDF