1. Schema formation in a neural population subspace underlies learning-to-learn in flexible sensorimotor problem-solving
- Author
- Elizabeth A. Buffalo, Barbara Peysakhovich, Vishwa Goudar, Xiao-Jing Wang, and David J. Freedman
- Subjects
- Speedup, Neural substrate, Computer science, Mechanism (biology), General Neuroscience, Population, Representation, Knowledge acquisition, Schema, Artificial intelligence, Subspace
- Abstract
Learning-to-learn, a progressive speedup of learning while solving a series of similar problems, is a core process of knowledge acquisition that has drawn attention in both neuroscience and artificial intelligence. To investigate its underlying brain mechanism, we trained a recurrent neural network model on arbitrary sensorimotor mappings, a task known to depend on the prefrontal cortex. The network displayed an exponential time course of accelerated learning. The neural substrate of a schema emerges within a low-dimensional subspace of population activity; its reuse in new problems facilitates learning by limiting connection weight changes. Our work highlights weight-driven modifications of the vector field, which determines the population trajectory of a recurrent network and hence its behavior. Such plasticity is especially important for preserving and reusing the learned schema despite undesirable changes in the vector field during the transition to a new problem; the changes accumulated across problems account for the learning-to-learn dynamics.
- Published
- 2023
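
The abstract describes training a recurrent network on a sequence of arbitrary sensorimotor mapping problems and locating a schema in a low-dimensional subspace of population activity. The following is a minimal, hypothetical sketch of that kind of setup, not the authors' model or training protocol: a vanilla RNN is trained on a series of random stimulus-to-response mappings, the number of gradient updates needed to reach a learning criterion is recorded per problem (a falling count across problems would reflect learning-to-learn), and a PCA on hidden states gives a crude stand-in for the paper's subspace analysis. All task sizes, criteria, and hyperparameters here are assumptions for illustration only.

```python
# Illustrative sketch (not the authors' code): a vanilla RNN trained on a
# series of arbitrary stimulus-response mapping problems.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
N_STIM, N_RESP, HIDDEN, T = 4, 4, 64, 10   # assumed task sizes, not from the paper

class VanillaRNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.RNN(N_STIM, HIDDEN, nonlinearity="tanh", batch_first=True)
        self.readout = nn.Linear(HIDDEN, N_RESP)

    def forward(self, x):
        h, _ = self.rnn(x)                  # hidden states, shape (batch, T, HIDDEN)
        return self.readout(h[:, -1]), h    # response read out at the last time step

def make_problem(rng):
    """One 'problem' = an arbitrary one-to-one stimulus-to-response mapping."""
    return rng.permutation(N_RESP)

def make_batch(mapping, batch=64):
    stim = np.random.randint(N_STIM, size=batch)
    x = np.zeros((batch, T, N_STIM), dtype=np.float32)
    x[np.arange(batch), :, stim] = 1.0      # hold the stimulus on for the whole trial
    y = mapping[stim]
    return torch.from_numpy(x), torch.from_numpy(y)

model = VanillaRNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
rng = np.random.default_rng(0)

updates_to_criterion = []
hidden_states = []
for problem in range(20):                   # sequence of similar problems
    mapping = make_problem(rng)
    for step in range(1, 2001):
        x, y = make_batch(mapping)
        logits, h = model(x)
        loss = loss_fn(logits, y)
        opt.zero_grad()
        loss.backward()
        opt.step()
        acc = (logits.argmax(1) == y).float().mean().item()
        if acc >= 0.95:                     # arbitrary learning criterion
            break
    updates_to_criterion.append(step)
    hidden_states.append(h.detach().reshape(-1, HIDDEN).numpy())

print("updates to criterion per problem:", updates_to_criterion)

# Crude dimensionality check: PCA (via SVD) on pooled hidden states from later problems.
H = np.concatenate(hidden_states[-5:], axis=0)
H = H - H.mean(0)
var = np.linalg.svd(H, compute_uv=False) ** 2
print("fraction of variance in top 3 PCs:", round(float(var[:3].sum() / var.sum()), 3))
```

The PCA at the end is only a proxy for the subspace analysis referred to in the abstract; characterizing how a schema subspace is formed, preserved, and reused across problems requires the analyses developed in the paper itself.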