TaxoNN: A Light-Weight Accelerator for Deep Neural Network Training
- Source:
- 2020 IEEE International Symposium on Circuits and Systems (ISCAS), 2020, pp. 1-5
- Publication Year:
- 2020
Abstract
- Emerging intelligent embedded devices rely on Deep Neural Networks (DNNs) to interact with the real-world environment. Because environmental conditions change continuously over time, this interaction requires the ability to retrain DNNs in the field. Stochastic Gradient Descent (SGD) is a widely used algorithm that trains DNNs by iteratively optimizing the parameters over the training data. In this work, we first present a novel approach that adds training capability to a baseline (inference-only) DNN accelerator by splitting the SGD algorithm into simple computational elements. Building on this heuristic approach, we then propose TaxoNN, a light-weight accelerator for DNN training. TaxoNN tunes the DNN weights by reusing the hardware resources of the inference datapath through time-multiplexing and low-bitwidth arithmetic units. Our experimental results show that TaxoNN incurs, on average, only a 0.97% higher misclassification rate than a full-precision implementation, while providing 2.1× power saving and 1.65× area reduction over the state-of-the-art DNN training accelerator.
- Comment: Accepted to ISCAS 2020. 5 pages, 5 figures
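For context, the SGD update the abstract refers to is the standard rule w ← w − η∇L(w), applied iteratively over the training data. The sketch below illustrates this generic update on a toy one-weight regression problem; it is a minimal illustration of plain SGD, not TaxoNN's hardware decomposition, and the function name and toy model are illustrative assumptions.

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    # Standard SGD update: w <- w - lr * dL/dw
    return w - lr * grad

# Toy example: fit y = 2x with a single weight (illustrative only).
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x

w = 0.0
for _ in range(100):
    pred = w * x
    grad = np.mean(2.0 * (pred - y) * x)  # gradient of the MSE loss w.r.t. w
    w = sgd_step(w, grad, lr=0.1)

print(f"learned weight: {w:.3f}")  # converges toward 2.0
```

The paper's contribution, per the abstract, is decomposing this update into simple computational elements so the inference datapath can be reused for training via time-multiplexing.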
Details
- Database:
- arXiv
- Journal:
- 2020 IEEE International Symposium on Circuits and Systems (ISCAS), 2020, pp. 1-5
- Publication Type:
- Report
- Accession number:
- edsarx.2010.05197
- Document Type:
- Working Paper
- Full Text:
- https://doi.org/10.1109/ISCAS45731.2020.9181001