
Theoretical understanding of gradients of spike functions as boolean functions

Authors :
DongHyung Yoo
Doo Seok Jeong
Source :
Complex & Intelligent Systems, Vol 11, Iss 1, Pp 1-17 (2024)
Publication Year :
2024
Publisher :
Springer, 2024.

Abstract

Applying an error-backpropagation algorithm to spiking neural networks frequently requires fictive derivatives of spike functions (popularly referred to as surrogate gradients), because the spike function is considered non-differentiable. The non-differentiability comes into play when the spike function is viewed as a numeric function, most popularly, the Heaviside step function of membrane potential. To get back to basics, the spike function is not a numeric but a Boolean function that outputs True or False upon comparison of the current potential with the threshold. In this regard, we propose a method to evaluate the gradient of the spike function viewed as a Boolean function for fixed- and floating-point data formats. For both formats, the gradient closely resembles a delta function that peaks at the spiking threshold, which justifies approximating the spike function by the Heaviside step function. Unfortunately, the error-backpropagation algorithm with this gradient function fails to outperform popularly employed surrogate gradients, which may arise from the narrow peak of the gradient function and the consequent potential undershoot and overshoot around the spiking threshold with coarse timesteps. We provide theoretical grounds for this hypothesis.
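To make the setting concrete, the sketch below illustrates the conventional approach the abstract contrasts against: a Heaviside step function as the forward-pass spike decision, paired with a smooth surrogate derivative peaked at the threshold for the backward pass. This is not the Boolean-gradient method proposed in the article; the fast-sigmoid surrogate shape and the parameter names (`v_th`, `beta`) are illustrative assumptions.

```python
import numpy as np

def heaviside_spike(v, v_th=1.0):
    """Forward pass: spike (1.0) iff membrane potential v reaches threshold v_th.

    This is the Boolean comparison v >= v_th, cast to a numeric 0/1 output,
    i.e. the Heaviside step function the abstract refers to.
    """
    return (v >= v_th).astype(float)

def surrogate_grad(v, v_th=1.0, beta=10.0):
    """Backward pass: a common fast-sigmoid surrogate derivative (illustrative).

    The true derivative of the step is zero almost everywhere, so a smooth
    stand-in peaked at v_th is substituted; beta controls the peak sharpness.
    As beta grows, this narrows toward the delta-like gradient the abstract
    describes.
    """
    return 1.0 / (1.0 + beta * np.abs(v - v_th)) ** 2
```

In a training loop, `heaviside_spike` would be used for the forward computation while `surrogate_grad` replaces its derivative during backpropagation; the article's argument concerns what happens when the surrogate's peak is made very narrow under coarse simulation timesteps.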

Details

Language :
English
ISSN :
2199-4536 and 2198-6053
Volume :
11
Issue :
1
Database :
Directory of Open Access Journals
Journal :
Complex & Intelligent Systems
Publication Type :
Academic Journal
Accession number :
edsdoj.86d30ee00ace4cb7886edc7008fb2a6f
Document Type :
article
Full Text :
https://doi.org/10.1007/s40747-024-01607-9