Optimizing Deeper Spiking Neural Networks for Dynamic Vision Sensing
- Source: Neural Networks, 144:686-698
- Publication Year: 2021
- Publisher: Elsevier BV
Abstract
- Spiking Neural Networks (SNNs) have recently emerged as a new generation of low-power deep neural networks due to sparse, asynchronous, and binary event-driven processing. Most previous deep SNN optimization methods focus on static datasets (e.g., MNIST) from a conventional frame-based camera. On the other hand, optimization techniques for event data from Dynamic Vision Sensor (DVS) cameras are still in their infancy. Most prior SNN techniques handling DVS data are limited to shallow networks and thus show low performance. Generally, we observe that the integrate-and-fire behavior of spiking neurons diminishes spike activity in deeper layers. The sparse spike activity results in a sub-optimal solution during training (i.e., performance degradation). To address this limitation, we propose novel algorithmic and architectural advances to accelerate the training of very deep SNNs on DVS data. Specifically, we propose Spike Activation Lift Training (SALT), which increases spike activity across all layers by optimizing both weights and thresholds in convolutional layers. After applying SALT, we train the weights based on the cross-entropy loss. SALT helps the network convey ample information across all layers during training and therefore improves performance. Furthermore, we propose a simple and effective architecture, called Switched-BN, which exploits Batch Normalization (BN). Previous methods show that standard BN is incompatible with the temporal dynamics of SNNs. Therefore, in the Switched-BN architecture, we apply BN to the last layer of an SNN after accumulating all the spikes from the previous layer with a spike voltage accumulator (i.e., converting temporal spike information to a float value). Even though we apply BN in just one layer of the SNN, our results demonstrate a considerable performance gain without any significant computational overhead. Through extensive experiments, we show the effectiveness of SALT and Switched-BN for training very deep SNNs from scratch on various benchmarks, including DVS-Cifar10, N-Caltech, DHP19, CIFAR10, and CIFAR100. To the best of our knowledge, this is the first work showing state-of-the-art performance with deep SNNs on DVS data.
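
The abstract compresses two mechanisms into prose: integrate-and-fire dynamics whose sparse firing starves deeper layers, and Switched-BN, which accumulates output spikes into float counts before applying standard Batch Normalization once at the last layer. Below is a minimal PyTorch sketch of both ideas; the class names (`LIFNeuron`, `SwitchedBNHead`), the leak constant, the soft-reset rule, and the learnable-threshold parameterization are illustrative assumptions, not the paper's actual implementation of SALT.

```python
import torch
import torch.nn as nn

class LIFNeuron(nn.Module):
    """Leaky integrate-and-fire neuron with a learnable firing threshold.

    SALT (per the abstract) optimizes weights *and* thresholds; exposing
    the threshold as an nn.Parameter is one plausible way to do that.
    """
    def __init__(self, init_threshold: float = 1.0, leak: float = 0.99):
        super().__init__()
        self.threshold = nn.Parameter(torch.tensor(init_threshold))
        self.leak = leak  # assumed decay factor, not from the paper

    def forward(self, x, mem):
        mem = self.leak * mem + x                # integrate input current
        spike = (mem >= self.threshold).float()  # binary spike on crossing
        mem = mem - spike * self.threshold       # soft reset (assumed)
        return spike, mem

class SwitchedBNHead(nn.Module):
    """Spike voltage accumulator followed by standard BatchNorm.

    Summing spikes over all timesteps converts the temporal spike train
    into a float count, so BN sees an ordinary real-valued tensor and the
    per-timestep incompatibility described in the abstract never arises.
    """
    def __init__(self, num_features: int, num_classes: int):
        super().__init__()
        self.bn = nn.BatchNorm1d(num_features)
        self.fc = nn.Linear(num_features, num_classes)

    def forward(self, spike_train):
        # spike_train: (timesteps, batch, num_features), entries in {0, 1}
        acc = spike_train.sum(dim=0)   # accumulate spikes -> float counts
        return self.fc(self.bn(acc))   # BN applied once, at the last layer

# Toy forward pass: 20 timesteps, batch of 8, 128 features, 10 classes.
T, B, F = 20, 8, 128
neuron, head = LIFNeuron(), SwitchedBNHead(F, 10)
mem, spikes = torch.zeros(B, F), []
for _ in range(T):
    s, mem = neuron(torch.randn(B, F), mem)
    spikes.append(s)
logits = head(torch.stack(spikes))    # shape: (8, 10)
```

Note that the hard threshold comparison is non-differentiable, so training such a network end to end would additionally require a surrogate gradient; the sketch only illustrates the forward dynamics.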
- Subjects :
- Neurons
Spiking neural network
Computer science
Entropy
Cognitive Neuroscience
Normalization (image processing)
Pattern recognition
Neuromorphic engineering
Artificial Intelligence
Asynchronous communication
Neural Networks, Computer
Artificial intelligence
Accumulator (computing)
Vision, Ocular
MNIST database
Details
- ISSN: 0893-6080
- Volume: 144
- Database: OpenAIRE
- Journal: Neural Networks
- Accession number: edsair.doi.dedup.....5c07c20c64b49466a4a24a216f862022
- Full Text: https://doi.org/10.1016/j.neunet.2021.09.022