
Q-SNNs: Quantized Spiking Neural Networks

Authors :
Wei, Wenjie
Liang, Yu
Belatreche, Ammar
Xiao, Yichen
Cao, Honglin
Ren, Zhenbang
Wang, Guoqing
Zhang, Malu
Yang, Yang
Publication Year :
2024

Abstract

Brain-inspired Spiking Neural Networks (SNNs) leverage sparse spikes to represent information and process it in an asynchronous, event-driven manner, offering an energy-efficient paradigm for the next generation of machine intelligence. However, the current focus within the SNN community prioritizes accuracy optimization through the development of large-scale models, limiting their viability on resource-constrained, low-power edge devices. To address this challenge, we introduce a lightweight and hardware-friendly Quantized SNN (Q-SNN) that applies quantization to both synaptic weights and membrane potentials. By significantly compressing these two key elements, the proposed Q-SNNs substantially reduce both memory usage and computational complexity. Moreover, to prevent the performance degradation caused by this compression, we present a new Weight-Spike Dual Regulation (WS-DR) method inspired by information entropy theory. Experimental evaluations on various datasets, both static and neuromorphic, demonstrate that our Q-SNNs outperform existing methods in terms of both model size and accuracy. These state-of-the-art results in efficiency and efficacy suggest that the proposed method can significantly advance intelligent edge computing.

Comment: 8 pages, 5 figures
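The abstract does not specify implementation details, but the core idea of quantizing both synaptic weights and membrane potentials can be illustrated with a minimal sketch. The Python snippet below is an assumption-laden illustration only: it uses plain uniform quantization, a hard-reset LIF neuron, and hypothetical bit-widths and clipping ranges, and does not reproduce the authors' Q-SNN quantizer or the WS-DR regulation method.

```python
import numpy as np

def quantize_uniform(x, num_bits, x_max):
    """Uniformly quantize x to signed num_bits levels over [-x_max, x_max] (illustrative)."""
    levels = 2 ** (num_bits - 1) - 1            # e.g. 1 level each side for 2 bits
    scale = x_max / levels
    q = np.clip(np.round(x / scale), -levels, levels)
    return q * scale                            # dequantized value used in simulation

def lif_step(weights_q, spikes_in, v, v_threshold=1.0, v_bits=4, v_max=2.0):
    """One time step of a LIF layer with quantized weights and membrane potential."""
    v = v + weights_q @ spikes_in               # integrate quantized synaptic input
    v = quantize_uniform(v, v_bits, v_max)      # compress the membrane potential
    spikes_out = (v >= v_threshold).astype(np.float32)
    v = v * (1.0 - spikes_out)                  # hard reset after a spike
    return spikes_out, v

# Toy usage: 4-input, 3-neuron layer with 2-bit weights and 4-bit potentials
# (bit-widths chosen for illustration, not taken from the paper).
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.5, size=(3, 4)).astype(np.float32)
w_q = quantize_uniform(w, num_bits=2, x_max=1.0)
v = np.zeros(3, dtype=np.float32)
spikes_in = rng.integers(0, 2, size=4).astype(np.float32)
spikes_out, v = lif_step(w_q, spikes_in, v)
print(spikes_out, v)
```

Because both the stored weights and the per-step membrane state are held at low precision in a scheme like this, memory footprint and arithmetic cost shrink accordingly, which is the efficiency argument the abstract makes; the WS-DR method is what the paper proposes to recover the accuracy lost to such compression.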

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2406.13672
Document Type :
Working Paper