Deep Phasor Networks: Connecting Conventional and Spiking Neural Networks
- Publication Year :
- 2021
Abstract
- In this work, we extend standard neural networks by building upon an assumption that neuronal activations correspond to the angle of a complex number lying on the unit circle, or 'phasor.' Each layer in such a network produces new activations by taking a weighted superposition of the previous layer's phases and calculating the new phase value. This generalized architecture allows models to reach high accuracy and carries the singular advantage that mathematically equivalent versions of the network can be executed with or without regard to a temporal variable. Importantly, the value of a phase angle in the temporal domain can be sparsely represented by a periodically repeating series of delta functions or 'spikes'. We demonstrate the atemporal training of a phasor network on standard deep learning tasks and show that these networks can then be executed in either the traditional atemporal domain or the spiking temporal domain with no conversion step needed. This provides a novel basis for constructing deep networks which operate via temporal, spike-based calculations suitable for neuromorphic computing hardware.
- Comment: 24 pages, 8 figures
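The atemporal forward pass described in the abstract can be sketched in a few lines: each activation is treated as an angle, the layer forms a weighted complex sum of the corresponding unit phasors, and the new activation is the phase of that sum. The function name, shapes, and random weights below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def phasor_layer(phases, weights):
    """Illustrative sketch of one atemporal phasor-network layer.

    `phases` holds the previous layer's activations as angles (radians).
    The layer maps each angle to a unit-magnitude complex number
    ('phasor'), takes a weighted superposition, and returns the phase
    of the resulting complex sum as the new activation vector.
    """
    superposition = weights @ np.exp(1j * phases)  # weighted sum of unit phasors
    return np.angle(superposition)                 # new phases in (-pi, pi]

# Hypothetical usage: 4 input phases feeding 3 output units.
rng = np.random.default_rng(0)
x = rng.uniform(-np.pi, np.pi, size=4)   # input phase activations
W = rng.normal(size=(3, 4))              # real-valued weight matrix
y = phasor_layer(x, W)                   # shape (3,), each entry an angle
```

Because the output of each layer is again a set of angles, layers of this form compose directly; the paper's spiking execution would instead encode each angle in the timing of periodically repeating spikes.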
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2106.11908
- Document Type :
- Working Paper