
Stochastic Synapses Enable Efficient Brain-Inspired Learning Machines

Authors:
Neftci, Emre O.
Pedroni, Bruno U.
Joshi, Siddharth
Al-Shedivat, Maruan
Cauwenberghs, Gert
Source:
Frontiers in Neuroscience 10 (2016): 241
Publication Year:
2015

Abstract

Recent studies have shown that synaptic unreliability is a robust and sufficient mechanism for inducing the stochasticity observed in cortex. Here, we introduce Synaptic Sampling Machines, a class of neural network models that uses synaptic stochasticity as a means for Monte Carlo sampling and unsupervised learning. Like Boltzmann machines in their original formulation, these models can be viewed as stochastic counterparts of Hopfield networks, but with stochasticity induced by a random mask over the connections. Synaptic stochasticity plays a dual role: it is an efficient mechanism for sampling and, during learning, a regularizer akin to DropConnect. A local synaptic plasticity rule implementing an event-driven form of contrastive divergence enables on-line learning of generative models. Synaptic Sampling Machines perform equally well with discrete-time artificial units (as in Hopfield networks) or continuous-time leaky integrate-and-fire neurons. The learned representations are remarkably sparse and robust to reductions in bit precision and to synapse pruning: removing more than 75% of the weakest connections, followed by brief re-learning, causes negligible performance loss on benchmark classification tasks. The spiking-neuron-based Synaptic Sampling Machines outperform existing spike-based unsupervised learners while potentially offering substantial advantages in power and complexity, and are thus promising models for on-line learning in brain-inspired hardware.
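The following is a minimal sketch, not the authors' implementation: it illustrates the random blank-out mask over connections described in the abstract, with each synapse independently failing to transmit with some probability on every update, and uses a standard discrete-time CD-1 weight update as a stand-in for the paper's event-driven contrastive divergence rule. All names and parameters (p_blank, cd1_step, the 6-by-4 network) are illustrative assumptions.

```python
# Sketch of stochastic "blank-out" synapses in a small RBM-like network.
# Assumptions, not the authors' code: p_blank, the network size, and the
# CD-1 update (stand-in for the event-driven contrastive divergence rule).
import numpy as np

rng = np.random.default_rng(0)
n_vis, n_hid = 6, 4
W = rng.normal(0.0, 0.1, size=(n_vis, n_hid))
p_blank = 0.5  # probability a synapse fails to transmit on a given update


def masked(W):
    """Resample the blank-out mask: each connection survives independently
    with probability 1 - p_blank; survivors are rescaled to keep the
    expected drive constant."""
    return W * (rng.random(W.shape) > p_blank) / (1.0 - p_blank)


def up(v, W):
    """Sample the hidden layer given the visible layer; the freshly drawn
    mask makes every update a Monte Carlo sample over configurations."""
    p = 1.0 / (1.0 + np.exp(-v @ masked(W)))
    return p, (rng.random(n_hid) < p).astype(float)


def down(h, W):
    """Sample the visible layer given the hidden layer."""
    p = 1.0 / (1.0 + np.exp(-h @ masked(W).T))
    return p, (rng.random(n_vis) < p).astype(float)


def cd1_step(v0, W, lr=0.05):
    """One CD-1 weight update: positive phase on the data, negative phase
    on a one-step reconstruction, both run through unreliable synapses."""
    p_h0, h0 = up(v0, W)
    _, v1 = down(h0, W)
    p_h1, _ = up(v1, W)
    return W + lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))


v = rng.integers(0, 2, n_vis).astype(float)
for _ in range(100):
    W = cd1_step(v, W)
```

The rescaling by 1/(1 - p_blank) keeps the expected synaptic drive constant, so the blank-out acts purely as a noise source for sampling; dropping it leaves a DropConnect-style attenuation as well. This is a design choice of the sketch, not something the abstract specifies.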

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.1511.04484
Document Type:
Working Paper
Full Text:
https://doi.org/10.3389/fnins.2016.00241