1. Neuron Circuits for Low-Power Spiking Neural Networks Using Time-To-First-Spike Encoding
- Authors
Seongbin Oh, Dongseok Kwon, Gyuho Yeom, Won-Mook Kang, Soochang Lee, Sung Yun Woo, Jaehyeon Kim, and Jong-Ho Lee
- Subjects
Neuromorphic, spiking neural networks (SNNs), hardware-based neural networks, time-to-first-spike (TTFS) coding, temporal coding, neuron circuits, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
Hardware-based spiking neural networks (SNNs) are regarded as promising candidates for cognitive computing systems due to their low power consumption and highly parallel operation. In this paper, we use temporal backpropagation to train an SNN in which information is carried by the firing time. The temporally encoded SNN with 512 hidden neurons achieved an accuracy of 96.90% on the MNIST test set. Furthermore, the effect of device variation on the accuracy of the temporally encoded SNN is investigated and compared with that of a rate-encoded network. In a hardware configuration of our SNN, a NOR-type analog memory with an asymmetric floating gate is used as the synaptic device. In addition, we propose a neuron circuit including a refractory-period generator for the temporally encoded SNN. The performance of a 2-layer neural network composed of these synapses and the proposed neurons is evaluated through SPICE circuit simulation based on the BSIM3v3 model with 0.35 μm technology. The network with 128 hidden neurons achieved an accuracy of 94.9% on the MNIST dataset, only 0.1% lower than the system-level simulation. Finally, the latency and power consumption of each block constituting the temporal network are analyzed and compared with those of the rate-encoded network as a function of the total number of time steps. With 256 total time steps, the temporal network consumes 15.12 times less power than the rate-encoded network and makes decisions 5.68 times faster.
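For readers unfamiliar with TTFS coding, the sketch below illustrates the general idea in Python: each input value is represented by the time of a single spike, with stronger inputs firing earlier, so an MNIST image is encoded with at most one spike per pixel within a fixed time window (e.g. the 256 time steps mentioned in the abstract). The linear mapping and the `ttfs_encode` helper are illustrative assumptions for this listing, not the exact encoding used by the authors.

```python
import numpy as np

def ttfs_encode(intensities, total_steps=256):
    """Map normalized intensities in [0, 1] to first-spike times.

    Brighter pixels fire earlier; a zero-intensity pixel emits no spike,
    encoded here with the sentinel value `total_steps` (outside the window).
    This is an assumed linear mapping for illustration only.
    """
    intensities = np.clip(np.asarray(intensities, dtype=float), 0.0, 1.0)
    spike_times = np.where(
        intensities > 0.0,
        np.round((1.0 - intensities) * (total_steps - 1)).astype(int),
        total_steps,  # sentinel: no spike within the time window
    )
    return spike_times

# Example: encode a random 28x28 MNIST-like image.
image = np.random.rand(28, 28)
times = ttfs_encode(image, total_steps=256)
print(times.shape, times.min(), times.max())
```

Because each neuron fires at most once per inference, the number of spike events (and hence the switching energy) is bounded by the number of neurons, which is the intuition behind the power and latency advantage over rate coding reported in the abstract.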
- Published
2022