1. Neural and Synaptic Array Transceiver: A Brain-Inspired Computing Framework for Embedded Learning
- Authors
- Georgios Detorakis, Sadique Sheik, Charles Augustine, Somnath Paul, Bruno U. Pedroni, Nikil Dutt, Jeffrey Krichmar, Gert Cauwenberghs, and Emre Neftci
- Subjects
- Neuromorphic computing, neuromorphic engineering, neuromorphic algorithms, spiking neural networks, event-based computing, on-line learning, three-factor learning, reinforcement learning, unsupervised learning, deep learning, sequence learning, embedded systems, computer architecture, robotics, artificial intelligence (cs.AI), neural and evolutionary computing (cs.NE), neuroscience
- Abstract
Embedded, continual learning for autonomous and adaptive behavior is a key application of neuromorphic hardware. However, flexible and efficient neuromorphic implementations of embedded learning at large scale have been hindered by the lack of a suitable algorithmic framework. As a result, most neuromorphic hardware is trained off-line on large clusters of dedicated processors or GPUs and transferred post hoc to the device. We address this by introducing the neural and synaptic array transceiver (NSAT), a neuromorphic computational framework that facilitates flexible and efficient embedded learning by matching algorithmic requirements with neural and synaptic dynamics. NSAT supports event-driven supervised, unsupervised, and reinforcement learning algorithms, including deep learning. We demonstrate NSAT on a wide range of tasks, including the simulation of the Mihalas-Niebur neuron, dynamic neural fields, event-driven random back-propagation for event-based deep learning, event-based contrastive divergence for unsupervised learning, and voltage-based learning rules for sequence learning. We anticipate that this contribution will establish the foundation for a new generation of devices enabling adaptive mobile systems, wearable devices, and robots with data-driven autonomy.
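For readers unfamiliar with the event-based computation the abstract refers to, the sketch below shows a toy leaky integrate-and-fire neuron driven by discrete input spike events. It is only an illustration of the general idea of spiking, event-driven dynamics, not the NSAT framework itself; the function name, parameters, and numeric values are all hypothetical choices made for this example.

```python
# Minimal, illustrative sketch of an event-driven leaky integrate-and-fire
# neuron. This is NOT the NSAT implementation; all names, parameters, and
# values below are hypothetical and chosen only for demonstration.

def simulate_lif(input_spike_steps, weight=0.6, leak=0.95,
                 v_threshold=1.0, v_reset=0.0, n_steps=100):
    """Simulate one neuron for n_steps; return the steps at which it spikes."""
    events = set(input_spike_steps)
    v = 0.0
    output_spikes = []
    for t in range(n_steps):
        v *= leak                      # passive leak of the membrane potential
        if t in events:                # synaptic input arrives only as discrete events
            v += weight
        if v >= v_threshold:           # threshold crossing emits an output event
            output_spikes.append(t)
            v = v_reset
    return output_spikes

# A short burst of input events drives the neuron over threshold.
print(simulate_lif([5, 6, 7, 8, 40, 41, 42, 43]))
```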
- Published
- 2018