A Time Encoding Approach to Training Spiking Neural Networks
- Source :
- ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
- Publication Year :
- 2022
- Publisher :
- IEEE, 2022.
Abstract
- While Spiking Neural Networks (SNNs) have been gaining in popularity, it seems that the algorithms used to train them are not powerful enough to solve the same tasks as those tackled by classical Artificial Neural Networks (ANNs). In this paper, we provide an extra tool to help us understand and train SNNs by using theory from the field of time encoding. Time encoding machines (TEMs) can be used to model integrate-and-fire neurons and have well-understood reconstruction properties. We will see how one can take inspiration from the field of TEMs to interpret the spike times of SNNs as constraints on the SNNs' weight matrices. More specifically, we study how to train one-layer SNNs by solving a set of linear constraints, and how to train two-layer SNNs by leveraging the all-or-none and asynchronous properties of the spikes emitted by SNNs. These properties of spikes result in an alternative to backpropagation which is not possible in the case of simultaneous and graded activations as in classical ANNs.
- Comment: 5 pages, 5 figures, submitted to IEEE ICASSP 2022
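- Illustration: the abstract's one-layer idea, interpreting spike times as linear constraints on the weights, can be sketched with standard TEM/integrate-and-fire reasoning: between consecutive spikes the weighted, integrated input must reach the firing threshold, so each inter-spike interval yields one linear equation in the weights. The snippet below is a minimal NumPy sketch under that assumption, not the authors' implementation; the input signals, `target_spikes`, and `threshold` are illustrative placeholders.

```python
# Minimal sketch (assumption, not the paper's code): for an integrate-and-fire
# neuron driven by a weighted sum of inputs, each inter-spike interval (t_{k-1}, t_k]
# gives a linear constraint on the weights w:
#     sum_j w_j * integral_{t_{k-1}}^{t_k} x_j(t) dt = threshold,
# so a set of desired spike times can be fit by solving a linear system.

import numpy as np

rng = np.random.default_rng(0)

# Discretized input signals: J channels sampled on a time grid.
J, T, dt = 4, 1000, 1e-3
t = np.arange(T) * dt
x = rng.standard_normal((J, T)).cumsum(axis=1) * np.sqrt(dt)  # smooth-ish inputs

threshold = 1.0
target_spikes = np.array([0.15, 0.32, 0.55, 0.71, 0.90])  # desired spike times (s)

# Build the constraint matrix: one row per inter-spike interval,
# A[k, j] ~ integral of x_j over (t_{k-1}, t_k].
bounds = np.concatenate(([0.0], target_spikes))
A = np.zeros((len(target_spikes), J))
for k in range(len(target_spikes)):
    mask = (t > bounds[k]) & (t <= bounds[k + 1])
    A[k] = x[:, mask].sum(axis=1) * dt

# Solve the (possibly over- or under-determined) system in least squares.
w, *_ = np.linalg.lstsq(A, threshold * np.ones(len(target_spikes)), rcond=None)
print("fitted weights:", w)
print("max constraint residual:", np.abs(A @ w - threshold).max())
```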
- Subjects :
- Signal Processing (eess.SP)
- FOS: Computer and information sciences
- Computer Science - Machine Learning
- Quantitative Biology::Neurons and Cognition
- Computer Science::Neural and Evolutionary Computation
- FOS: Electrical engineering, electronic engineering, information engineering
- Computer Science - Neural and Evolutionary Computing
- Neural and Evolutionary Computing (cs.NE)
- LCAV-MSP
- Electrical Engineering and Systems Science - Signal Processing
- Machine Learning (cs.LG)
Details
- Database :
- OpenAIRE
- Journal :
- ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- Accession number :
- edsair.doi.dedup.....77470608f631e3d6a326248542cdd4ac
- Full Text :
- https://doi.org/10.1109/icassp43922.2022.9746319