Deep Learning With Spiking Neurons: Opportunities and Challenges

Authors :
Thomas Pfeil
Michael Pfeiffer
Source :
Frontiers in Neuroscience, Vol 12 (2018)
Publication Year :
2018
Publisher :
Frontiers Media SA, 2018.

Abstract

Spiking neural networks (SNNs) are inspired by information processing in biology, where sparse and asynchronous binary signals are communicated and processed in a massively parallel fashion. SNNs on neuromorphic hardware exhibit favorable properties such as low power consumption, fast inference, and event-driven information processing. This makes them interesting candidates for the efficient implementation of deep neural networks, the method of choice for many machine learning tasks. In this review, we address the opportunities that deep spiking networks offer and investigate in detail the challenges associated with training SNNs in a way that makes them competitive with conventional deep learning, but simultaneously allows for efficient mapping to hardware. A wide range of training methods for SNNs is presented, ranging from the conversion of conventional deep networks into SNNs and constrained training before conversion, to spiking variants of backpropagation and biologically motivated variants of STDP. The goal of our review is to define a categorization of SNN training methods and summarize their advantages and drawbacks. We further discuss relationships between SNNs and binary networks, which are becoming popular for efficient digital hardware implementation. Neuromorphic hardware platforms have great potential to enable deep spiking networks in real-world applications. We compare the suitability of various neuromorphic systems that have been developed over the past years, and investigate potential use cases. Neuromorphic approaches and conventional machine learning should not be considered simply as two solutions to the same classes of problems; instead, it is possible to identify and exploit their task-specific advantages. Deep SNNs offer great opportunities to work with new types of event-based sensors and to exploit temporal codes and local on-chip learning; so far, we have just scratched the surface of realizing these advantages in practical applications.

Details

Language :
English
ISSN :
1662-453X
Volume :
12
Database :
OpenAIRE
Journal :
Frontiers in Neuroscience
Accession number :
edsair.doi.dedup.....29f4d7cafeceb1f5e1bf60d2de07d97b
Full Text :
https://doi.org/10.3389/fnins.2018.00774