1. Polarity is all you need to learn and transfer faster
- Authors
Wang, Qingyang; Powell, Michael A.; Geisa, Ali; Bridgeford, Eric W.; Vogelstein, Joshua T.
- Subjects
FOS: Computer and information sciences, Computer Science - Machine Learning, Computer Vision and Pattern Recognition (cs.CV), FOS: Biological sciences, Quantitative Biology - Neurons and Cognition, Computer Science - Computer Vision and Pattern Recognition, Computer Science - Neural and Evolutionary Computing, Neurons and Cognition (q-bio.NC), Neural and Evolutionary Computing (cs.NE), Machine Learning (cs.LG)
- Abstract
Natural intelligences (NIs) thrive in a dynamic world - they learn quickly, sometimes with only a few samples. In contrast, artificial intelligences (AIs) typically require prohibitive amounts of training data and computation to learn. What difference in design principles between NIs and AIs could contribute to such a discrepancy? Here, we investigate the role of weight polarity: developmental processes initialize NIs with advantageous polarity configurations; as NIs grow and learn, synapse magnitudes update, yet polarities are largely kept unchanged. We demonstrate with simulations and image classification tasks that if weight polarities are adequately set a priori, then networks learn with less time and data. We also explicitly illustrate situations in which setting the weight polarities a priori is disadvantageous for networks. Our work illustrates the value of weight polarities from the perspective of statistical and computational efficiency during learning.
ICML camera-ready.
- Published
2023
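The abstract's core mechanism - freezing weight polarities (signs) and training only magnitudes - can be sketched on a toy regression problem. This is a minimal illustrative example, not the paper's actual experimental setup; the task, learning rate, and the choice of initializing polarities to the true signs (an "advantageous" configuration in the paper's sense) are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear regression target: y = X @ w_true (hypothetical data, not from the paper)
w_true = np.array([1.5, -2.0, 0.5])
X = rng.normal(size=(200, 3))
y = X @ w_true

# Polarities are fixed a priori and never updated; only magnitudes train.
polarity = np.sign(w_true)        # frozen: +1 or -1 per weight
magnitude = np.full(3, 0.1)       # trainable, kept non-negative

lr = 0.05
for _ in range(500):
    w = polarity * magnitude      # effective weights
    err = X @ w - y
    grad_w = X.T @ err / len(X)   # gradient of MSE w.r.t. w
    # Chain rule: dL/dmagnitude = (dL/dw) * polarity
    magnitude -= lr * grad_w * polarity
    magnitude = np.clip(magnitude, 0.0, None)  # magnitudes stay >= 0
```

Because updates act only on `magnitude`, the sign pattern of the effective weights is identical before and after training, mirroring the biological observation that synapse polarities stay largely unchanged while strengths adapt.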