A universal ANN-to-SNN framework for achieving high accuracy and low latency deep Spiking Neural Networks.
- Authors
- Wang Y, Liu H, Zhang M, Luo X, and Qu H
- Subjects
- Databases, Factual; Visual Perception; Neural Networks, Computer; Neurons
- Abstract
Spiking Neural Networks (SNNs) have become one of the most prominent next-generation computational models owing to their biological plausibility, low power consumption, and potential for neuromorphic hardware implementation. Among the various methods for obtaining usable SNNs, converting Artificial Neural Networks (ANNs) into SNNs is the most cost-effective approach. Early ANN-to-SNN conversion work struggled with the susceptibility of converted SNNs to conversion errors. Some recent efforts mitigate these errors by altering the original ANNs; although such methods improve SNN accuracy, they lack generality and cannot be applied directly to most existing ANNs. In this paper, we present DNISNM, an ANN-to-SNN conversion framework that addresses conversion errors arising from the differences in discreteness and asynchrony of network transmission between ANNs and SNNs. DNISNM consists of two mechanisms, Data-based Neuronal Initialization (DNI) and Signed Neuron with Memory (SNM), designed to address the errors stemming from discreteness and asynchrony disparities, respectively. The framework requires no additional modification of the original ANN and yields SNNs with improved accuracy while ensuring universality, high precision, and low inference latency. We verify it experimentally on challenging object-recognition datasets, including CIFAR10, CIFAR100, and ImageNet-1k. Experimental results show that SNNs converted by our framework retain very high accuracy even at extremely low latency.
- Competing Interests
- Declaration of competing interest: The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
- Copyright
- © 2024 Elsevier Ltd. All rights reserved.
- Published
- 2024
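The "discreteness" conversion error mentioned in the abstract can be illustrated with a generic sketch (this is not the paper's DNISNM method, and the function names are hypothetical): in rate-coded ANN-to-SNN conversion, an integrate-and-fire (IF) neuron's firing rate approximates a ReLU activation, and the approximation error shrinks only as the number of simulation timesteps T grows, which is why naive conversion trades accuracy against latency.

```python
def relu(x):
    """Reference ANN activation."""
    return max(x, 0.0)

def if_neuron_rate(x, T, threshold=1.0):
    """Simulate an IF neuron driven by constant input x for T timesteps.

    Uses reset-by-subtraction and returns the firing rate scaled by the
    threshold, which approximates relu(x) for 0 <= x <= threshold.
    Hypothetical illustration, not the DNISNM mechanism from the paper.
    """
    v = 0.0       # membrane potential
    spikes = 0
    for _ in range(T):
        v += x                    # integrate the input current
        if v >= threshold:        # fire, then reset by subtraction
            spikes += 1
            v -= threshold
    return spikes / T * threshold

# The rate-based approximation improves with more timesteps T:
x = 0.37
for T in (8, 64, 512):
    approx = if_neuron_rate(x, T)
    print(f"T={T:4d}  rate={approx:.5f}  error={abs(approx - relu(x)):.5f}")
```

Running the loop shows the absolute error decreasing as T increases; conversion frameworks such as the one described in the abstract aim to remove this residual error so that low-latency (small-T) SNN inference stays accurate.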