Accurate and efficient time-domain classification with adaptive spiking recurrent neural networks
- Author
Federico Corradi, Sander M. Bohte, and Bojian Yin
- Subjects
FOS: Computer and information sciences, Computer Science - Machine Learning, Hardware implementations, Computer Networks and Communications, Computer science, Computer Science::Neural and Evolutionary Computation, Machine Learning (cs.LG), Models of neural computation, Artificial Intelligence, Neural and Evolutionary Computing (cs.NE), Time domain, Network model, Spiking neural network, Quantitative Biology::Neurons and Cognition, Artificial neural network, business.industry, Computer Science - Neural and Evolutionary Computing, Pattern recognition, Backpropagation, Human-Computer Interaction, Recurrent neural network, Gesture recognition, Computer Vision and Pattern Recognition, Artificial intelligence, business, Neural networks, Software
- Abstract
Inspired by more detailed modeling of biological neurons, spiking neural networks (SNNs) have been investigated both as more biologically plausible and potentially more powerful models of neural computation, and also with the aim of capturing biological neurons' energy efficiency; however, the performance of such networks has remained lacking compared to classical artificial neural networks (ANNs). Here, we demonstrate how a novel surrogate gradient combined with recurrent networks of tunable and adaptive spiking neurons yields state-of-the-art performance for SNNs on challenging benchmarks in the time domain, such as speech and gesture recognition. This also exceeds the performance of standard classical recurrent neural networks (RNNs) and approaches that of the best modern ANNs. As these SNNs exhibit sparse spiking, we show that they are theoretically one to three orders of magnitude more computationally efficient than RNNs with comparable performance. Together, this positions SNNs as an attractive solution for AI hardware implementations.
Comment: 11 pages
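The abstract only names the key ingredients (recurrent networks of adaptive spiking neurons trained through a surrogate gradient). As a rough illustration of what such a neuron update can look like, the sketch below implements a single adaptive leaky integrate-and-fire (ALIF) step with a surrogate spike gradient in PyTorch. It is a minimal, hypothetical example with assumed constants and a fast-sigmoid-style surrogate, not the authors' published model or code.

```python
# Minimal sketch (not the authors' exact model): one discrete-time update of an
# adaptive LIF neuron with a surrogate spike gradient. All constants and the
# surrogate shape are illustrative assumptions.
import torch

class SurrogateSpike(torch.autograd.Function):
    """Hard threshold in the forward pass; smooth surrogate in the backward pass."""

    @staticmethod
    def forward(ctx, v_minus_theta):
        ctx.save_for_backward(v_minus_theta)
        return (v_minus_theta > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_theta,) = ctx.saved_tensors
        # Fast-sigmoid-style surrogate derivative (an assumed choice).
        surrogate = 1.0 / (1.0 + 10.0 * v_minus_theta.abs()) ** 2
        return grad_output * surrogate

def alif_step(v, b, x, spikes_prev, alpha=0.9, rho=0.99, beta=1.8, theta0=1.0):
    """One update of an adaptive leaky integrate-and-fire (ALIF) neuron.

    v: membrane potential, b: adaptation trace, x: input current,
    spikes_prev: spikes from the previous step (tensors of equal shape).
    alpha/rho are membrane/adaptation decays; beta scales threshold adaptation.
    """
    theta = theta0 + beta * b                   # adaptive firing threshold
    v = alpha * v + x - theta * spikes_prev     # leak, integrate, soft reset
    spikes = SurrogateSpike.apply(v - theta)    # spike via surrogate gradient
    b = rho * b + (1.0 - rho) * spikes          # update adaptation trace
    return v, b, spikes
```

In a setup like this, the hard threshold is used in the forward pass, while the smooth surrogate derivative lets backpropagation through time assign credit to sub-threshold membrane potentials, which is what makes gradient-based training of recurrent spiking networks feasible.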
- Published
2021