Flexible cognition in context-modulated reservoir networks
- Authors
- Nicolas Y. Masse, Matthew C. Rosen, Doris Y. Tsao, and David J. Freedman
- Abstract
The brains of all animals are plastic, allowing them to form new memories, adapt to new environments, and learn new tasks. What is less clear is how much plasticity these cognitive functions require: does learning demand widespread plasticity across the brain, or can it occur in more rigid networks, in which plasticity is highly localized? Here, we use biologically inspired recurrent neural network (RNN) models to show that rapid multitask learning can be accomplished in reservoir-style networks, in which synaptic plasticity is sparse and highly localized. Crucially, only RNNs initialized with highly specific combinations of network properties, such as topology, normalization, and reciprocal connection strength, are capable of such learning. Finally, we show that this rapid learning with localized plasticity can be accomplished with purely local error signals, without backpropagation, using a reinforcement learning setup. This work suggests that rapid learning in artificial (and potentially biological) agents can be accomplished with mostly rigid networks, in which synaptic plasticity is highly constrained.
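The reservoir-style setup the abstract describes can be illustrated with a minimal echo-state-network sketch: recurrent and input weights are fixed at initialization, and plasticity is confined to a single linear readout. This is a generic sketch of the reservoir-computing idea, not the authors' model; the toy sine-prediction task, network sizes, and spectral-radius value are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed (non-plastic) network: random input and recurrent weights.
n_in, n_res = 1, 200
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
# Scale the spectral radius below 1 so reservoir dynamics stay stable.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the fixed recurrent network and collect its hidden states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ u)
        states.append(x.copy())
    return np.array(states)

# Toy task (illustrative): predict a slightly advanced copy of a sine wave.
t = np.linspace(0, 8 * np.pi, 400)
u = np.sin(t)[:, None]
target = np.sin(t + 0.1)[:, None]

X = run_reservoir(u)
# Localized plasticity: only the linear readout is trained (ridge regression).
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ target)
pred = X @ W_out
mse = np.mean((pred[100:] - target[100:]) ** 2)  # skip the initial transient
```

In this sketch the recurrent weights never change after initialization, mirroring the paper's point that learning can succeed when plasticity is restricted to a small, localized set of synapses, provided the fixed network is initialized appropriately (e.g., with a suitable spectral radius).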
- Published
- 2022