
EXODUS: Stable and efficient training of spiking neural networks

Authors :
Felix C. Bauer
Gregor Lenz
Saeid Haghighatshoar
Sadique Sheik
Source :
Frontiers in Neuroscience, Vol 17 (2023)
Publication Year :
2023
Publisher :
Frontiers Media S.A., 2023.

Abstract

Introduction: Spiking Neural Networks (SNNs) are gaining significant traction in machine-learning tasks where energy efficiency is of utmost importance. Training such networks with the state-of-the-art back-propagation through time (BPTT) is, however, very time-consuming. Previous work employs an efficient GPU-accelerated backpropagation algorithm called SLAYER, which speeds up training considerably. SLAYER, however, does not take the neuron reset mechanism into account when computing gradients, which we argue to be a source of numerical instability. To counteract this, SLAYER introduces a gradient scale hyperparameter across layers, which needs manual tuning.

Methods: In this paper, we modify SLAYER and design an algorithm called EXODUS that accounts for the neuron reset mechanism and applies the Implicit Function Theorem (IFT) to calculate the correct gradients (equivalent to those computed by BPTT). We furthermore eliminate the need for ad hoc scaling of gradients, substantially reducing training complexity.

Results: We demonstrate, via computer simulations, that EXODUS is numerically stable and achieves performance comparable to or better than SLAYER, especially on tasks whose SNNs rely on temporal features.
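To illustrate the reset mechanism the abstract refers to, the following is a minimal PyTorch-style sketch of a leaky integrate-and-fire (LIF) layer trained with a surrogate gradient. It is not the authors' EXODUS or SLAYER code; the names SpikeFn and lif_forward, the boxcar surrogate, and the decay constant alpha = 0.9 are illustrative assumptions. The point of contrast is that the subtractive reset term stays inside the autograd graph, so it contributes to the gradients, whereas the abstract argues SLAYER omits it.

    import torch

    class SpikeFn(torch.autograd.Function):
        """Heaviside spike with a surrogate gradient on the backward pass.
        The boxcar surrogate below is a common generic choice, not
        necessarily the one used in the EXODUS paper."""

        @staticmethod
        def forward(ctx, v_mem, threshold):
            ctx.save_for_backward(v_mem)
            ctx.threshold = threshold
            return (v_mem >= threshold).float()

        @staticmethod
        def backward(ctx, grad_output):
            (v_mem,) = ctx.saved_tensors
            # Pass gradients only near the threshold; the window width
            # 0.5 is an arbitrary illustrative value.
            surrogate = (torch.abs(v_mem - ctx.threshold) < 0.5).float()
            return grad_output * surrogate, None

    def lif_forward(inputs, alpha=0.9, threshold=1.0):
        """LIF dynamics with subtractive reset; inputs has shape
        (time, batch, features). Keeping the reset in the graph means
        backpropagation sees the same dynamics the forward pass ran."""
        v_mem = torch.zeros_like(inputs[0])
        spikes = []
        for x_t in inputs:
            v_mem = alpha * v_mem + x_t          # leaky integration
            s_t = SpikeFn.apply(v_mem, threshold)
            v_mem = v_mem - s_t * threshold      # reset, differentiated too
            spikes.append(s_t)
        return torch.stack(spikes)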
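On the Methods claim about the Implicit Function Theorem: the paper's own derivation is not reproduced here, but as a generic illustration of how the IFT yields gradients without unrolling dynamics, suppose a state v^* is defined implicitly by a relation F(\theta, v^*) = 0 with \partial F / \partial v invertible at the solution. The IFT then gives

    \frac{dv^*}{d\theta} = -\left(\frac{\partial F}{\partial v}\right)^{-1} \frac{\partial F}{\partial \theta},

so the gradient of a loss L(v^*) with respect to the parameters follows by the chain rule, which is the general mechanism by which IFT-based gradients can match those of step-by-step BPTT.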

Details

Language :
English
ISSN :
1662-453X
Volume :
17
Database :
Directory of Open Access Journals
Journal :
Frontiers in Neuroscience
Publication Type :
Academic Journal
Accession number :
edsdoj.021534ced69f4b5b853f276ddd5227bf
Document Type :
article
Full Text :
https://doi.org/10.3389/fnins.2023.1110444