Self-Checking Residue Number System for Low-Power Reliable Neural Network
- Source: ATS
- Publication Year: 2019
- Publisher: IEEE, 2019.
Abstract
- Neural networks suffer from four major issues: acceleration, power consumption, area overhead, and fault tolerance. In this paper we develop a systematic approach to designing a low-power, compact, fast, and reliable neural network based on a redundant residue number system. Residue number systems have been applied to neural network design, but not to the CORDIC-based activation functions, including the hyperbolic tangent, logistic, and softmax functions. As a result, the entire neural network cannot be fully self-checked, and the extra operations waste the reductions in time, power, and area. In our systematic approach we propose design rules that guarantee the checking rate without sacrificing the reductions in time, area, and power consumption. In experiments on three neural networks with 24-bit fixed-point operations on the MNIST handwritten digit data set, 3, 4, and 5 moduli are separately employed to achieve balanced improvements in power saving, area reduction, speed-up, and reliability. Experimental results show that power, time, and area can each be reduced to about one third, and that the entire network, in any combination of software and hardware, can be self-checked with an aliasing rate of only 0.39% and is TMR-correctable under the single-residue fault model.
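The abstract's key mechanism is a redundant residue number system (RRNS): values are carried as residues modulo a set of pairwise-coprime moduli, and extra (redundant) residue channels let a single faulty residue be detected and masked, which is what makes the network self-checkable under the single-residue fault model. The sketch below is only an illustration of that general idea, not the paper's design: the moduli, the legitimate range, and the leave-one-out correction strategy are assumptions chosen for readability.

```python
# Minimal RRNS sketch: encode into residues, detect a single faulty residue,
# and correct it by leave-one-out reconstruction. Illustrative parameters only.
from math import prod

MODULI = [7, 11, 13, 15, 16]        # pairwise coprime; the last two act as redundant channels
K = 3                                # number of information moduli
LEGIT_RANGE = prod(MODULI[:K])       # fault-free values must fall in 0 .. LEGIT_RANGE-1

def encode(x):
    """Represent x by its residue in every (information + redundant) channel."""
    return [x % m for m in MODULI]

def crt(residues, moduli):
    """Chinese Remainder Theorem reconstruction over the given moduli."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(Mi, -1, m) is the modular inverse of Mi mod m
    return x % M

def decode_with_correction(residues):
    """Self-check the codeword; if one residue is faulty, mask it and recover the value."""
    x = crt(residues, MODULI)
    if x < LEGIT_RANGE:                # consistent reconstruction: no fault detected
        return x
    # A reconstruction outside the legitimate range signals a fault. Drop each
    # residue channel in turn; the subset that yields a consistent, in-range
    # value identifies the faulty channel and recovers the original value.
    for i in range(len(MODULI)):
        rs = residues[:i] + residues[i + 1:]
        ms = MODULI[:i] + MODULI[i + 1:]
        cand = crt(rs, ms)
        if cand < LEGIT_RANGE and all(cand % m == r for r, m in zip(rs, ms)):
            return cand
    raise ValueError("uncorrectable: more than one residue channel is faulty")

# Usage: encode a value, inject a single-residue fault, then detect and correct it.
value = 987
code = encode(value)
code[2] = (code[2] + 5) % MODULI[2]    # corrupt one residue channel
assert decode_with_correction(code) == value
```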
Details
- Database: OpenAIRE
- Journal: 2019 IEEE 28th Asian Test Symposium (ATS)
- Accession number: edsair.doi...........edca52f9832914077343c1c5c3e67003
- Full Text: https://doi.org/10.1109/ats47505.2019.000-3