1. Synaptic plasticity as Bayesian inference
- Author
Laurence Aitchison, Peter E. Latham, Jean-Pascal Pfister, Jannes Jegminat, Jorge Aurelio Menendez, and Alexandre Pouget
- Subjects
Quantitative Biology::Neurons and Cognition, Computer science, General Neuroscience, Machine learning, Bayesian inference, Bayes' theorem, Postsynaptic potential, Error bar, Synaptic plasticity, Falsifiability, Probability distribution, Artificial intelligence, Set (psychology), Neuroscience
- Abstract
Learning, especially rapid learning, is critical for survival. However, learning is hard: a large number of synaptic weights must be set based on noisy, often ambiguous, sensory information. In such a high-noise regime, keeping track of probability distributions over weights is the optimal strategy. Here we hypothesize that synapses adopt that strategy; in essence, when they estimate weights, they include error bars. They then use that uncertainty to adjust their learning rates, with more uncertain weights having higher learning rates. We also make a second, independent, hypothesis: synapses communicate their uncertainty by linking it to variability in postsynaptic potential size, with more uncertainty leading to more variability. These two hypotheses cast synaptic plasticity as a problem of Bayesian inference, and thus provide a normative view of learning. They generalize known learning rules, offer an explanation for the large variability in the size of postsynaptic potentials, and make falsifiable experimental predictions.
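The abstract's core idea, a per-weight learning rate that scales with the weight's uncertainty and postsynaptic potential variability that reports that uncertainty, can be illustrated with a minimal sketch. The following Python snippet is not the paper's actual learning rule; it is a hypothetical single-synapse, Kalman-filter-style update under assumed settings (w_true, obs_noise, drift and prior_var are illustrative choices, not quantities from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target weight the synapse must learn (illustrative, not from the paper).
w_true = 0.8
obs_noise = 0.5          # std of the noisy feedback signal

# The synapse tracks a Gaussian belief over its weight: a mean and a variance ("error bars").
w_mean, w_var = 0.0, 1.0
prior_var = 1.0          # variance toward which the belief drifts between updates
drift = 0.01             # how quickly old information is forgotten

for step in range(200):
    # Slow drift back toward the prior keeps uncertainty from collapsing to zero.
    w_var = w_var + drift * (prior_var - w_var)

    # Noisy, ambiguous feedback about the true weight (a stand-in for sensory error signals).
    feedback = w_true + obs_noise * rng.standard_normal()

    # Bayesian (Kalman-style) update: the learning rate is set by the current
    # uncertainty, so more uncertain weights learn faster.
    learning_rate = w_var / (w_var + obs_noise**2)
    w_mean += learning_rate * (feedback - w_mean)
    w_var *= (1.0 - learning_rate)

    # Second hypothesis: postsynaptic potential variability reflects the weight's uncertainty.
    psp = w_mean + np.sqrt(w_var) * rng.standard_normal()

print(f"estimated weight {w_mean:.2f} +/- {np.sqrt(w_var):.2f} (true {w_true})")
```

In this sketch the learning rate falls as the posterior variance shrinks, and the simulated postsynaptic potential is drawn with a spread equal to the current uncertainty, mirroring the two hypotheses stated in the abstract.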
- Published
- 2021