
Temporal Coding in Spiking Neural Networks With Alpha Synaptic Function: Learning With Backpropagation

Authors :
Comsa, Iulia-Maria
Potempa, Krzysztof
Versari, Luca
Fischbacher, Thomas
Gesmundo, Andrea
Alakuijala, Jyrki
Source :
IEEE Transactions on Neural Networks and Learning Systems; October 2022, Vol. 33, Issue 10, pp. 5939-5952 (14 pages)
Publication Year :
2022

Abstract

The timing of individual neuronal spikes is essential for biological brains to make fast responses to sensory stimuli. However, conventional artificial neural networks lack the intrinsic temporal coding ability present in biological networks. We propose a spiking neural network model that encodes information in the relative timing of individual spikes. In classification tasks, the output of the network is indicated by the first neuron to spike in the output layer. This temporal coding scheme allows the supervised training of the network with backpropagation, using locally exact derivatives of the postsynaptic spike times with respect to presynaptic spike times. The network operates using a biologically plausible synaptic transfer function. In addition, we use trainable pulses that provide bias, add flexibility during training, and exploit the decayed part of the synaptic function. We show that such networks can be successfully trained on multiple data sets encoded in time, including MNIST. Our model outperforms comparable spiking models on MNIST and achieves similar quality to fully connected conventional networks with the same architecture. The spiking network spontaneously discovers two operating modes, mirroring the accuracy-speed tradeoff observed in human decision-making: a highly accurate but slow regime, and a faster regime with slightly lower accuracy. These results demonstrate the computational power of spiking networks with biological characteristics that encode information in the timing of individual spikes. By studying temporal coding in spiking networks, we aim to create building blocks toward energy-efficient, state-based, biologically inspired neural architectures. We provide open-source code for the model.
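To make the coding scheme described in the abstract concrete, the sketch below illustrates an alpha-shaped postsynaptic potential and a first-to-spike readout. This is a minimal illustrative example, not the authors' implementation: the kernel form, the function names (alpha_psp, first_spike_time), and all parameter values (TAU, THRESHOLD, the example spike times and weights) are assumptions chosen for demonstration, and the threshold crossing is found numerically rather than with the paper's exact derivative-based machinery.

    import numpy as np

    # Illustrative sketch only (not the paper's code): an alpha synaptic
    # kernel and a first-spike classification readout, with assumed values.

    TAU = 1.0          # decay constant of the alpha kernel (assumed value)
    THRESHOLD = 1.0    # firing threshold (assumed value)

    def alpha_psp(t, t_in, tau=TAU):
        """Alpha-shaped kernel: zero before the input spike, then a smooth
        rise and decay proportional to (t - t_in) * exp(-(t - t_in) / tau)."""
        dt = t - t_in
        return np.where(dt > 0.0, dt * np.exp(-dt / tau), 0.0)

    def first_spike_time(in_times, weights, t_grid):
        """Numerically find the first time the summed membrane potential
        crosses THRESHOLD; returns np.inf if the neuron never fires."""
        potential = np.zeros_like(t_grid)
        for t_in, w in zip(in_times, weights):
            potential += w * alpha_psp(t_grid, t_in)
        crossed = np.nonzero(potential >= THRESHOLD)[0]
        return t_grid[crossed[0]] if crossed.size else np.inf

    # First-to-spike readout: the predicted class is the output neuron
    # that fires earliest in response to the temporally encoded input.
    t_grid = np.linspace(0.0, 10.0, 10_000)
    in_times = np.array([0.1, 0.4, 0.7])          # example input spike times
    out_weights = np.array([[1.2, 0.8, 0.3],      # one weight row per output neuron
                            [0.4, 1.5, 0.9]])
    spike_times = [first_spike_time(in_times, w, t_grid) for w in out_weights]
    predicted_class = int(np.argmin(spike_times))
    print(spike_times, predicted_class)

In this toy setup, earlier threshold crossings correspond to stronger or earlier-arriving evidence, which is what makes the "first neuron to spike" a natural class indicator under temporal coding.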

Details

Language :
English
ISSN :
2162-237X (print) and 2162-2388 (electronic)
Volume :
33
Issue :
10
Database :
Supplemental Index
Journal :
IEEE Transactions on Neural Networks and Learning Systems
Publication Type :
Periodical
Accession number :
ejs60975418
Full Text :
https://doi.org/10.1109/TNNLS.2021.3071976