Common Kernels and Convolutions in Binary- and Ternary-Weight Neural Networks.
- Source :
- Journal of Circuits, Systems & Computers; Jul2021, Vol. 30 Issue 9, p1-19, 19p
- Publication Year :
- 2021
Abstract
- A new algorithm for extracting common kernels and convolutions to maximally eliminate the redundant operations among the convolutions in binary- and ternary-weight convolutional neural networks is presented. Precisely, we propose (1) a new algorithm of common kernel extraction to overcome the local and limited exploration of common kernel candidates by the existing method, and subsequently apply (2) a new concept of common convolution extraction to maximally eliminate the redundancy in the convolution operations. In addition, our algorithm is able to (3) be tuned to minimize the number of resulting kernels for convolutions, thereby saving the total memory access latency for kernels. Experimental results on ternary-weight VGG-16 demonstrate that our convolution optimization algorithm is very effective, reducing the total number of operations for all convolutions by 25.8∼26.3%, thereby reducing the total number of execution cycles on the hardware platform by 22.4%, while using 2.7∼3.8% fewer kernels compared with convolutions utilizing the common kernels extracted by the state-of-the-art algorithm. [ABSTRACT FROM AUTHOR]
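- The abstract's core idea can be illustrated with a small sketch. This is not the paper's algorithm, only a minimal NumPy demonstration of the underlying principle it exploits: because convolution is linear and ternary kernels take values in {-1, 0, +1}, two kernels that share a common sub-kernel can reuse one partial convolution instead of recomputing the shared operations. The kernels, the chosen common part, and the helper `conv2d_valid` are all hypothetical.

```python
import numpy as np

def conv2d_valid(x, k):
    """Naive 'valid'-mode 2D correlation with a 3x3 kernel (illustration only)."""
    h, w = x.shape[0] - 2, x.shape[1] - 2
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(x[i:i+3, j:j+3] * k)
    return out

# Two ternary kernels that overlap in their first two rows.
k1 = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]])
k2 = np.array([[1, 0, -1], [1, 0, -1], [0, 1, 1]])

# Extracted common kernel (the shared rows) and the ternary residuals.
common = np.array([[1, 0, -1], [1, 0, -1], [0, 0, 0]])
r1 = k1 - common  # unique part of k1
r2 = k2 - common  # unique part of k2

x = np.random.randn(8, 8)

# The common convolution is computed once and reused for both outputs,
# eliminating the redundant operations of the shared sub-kernel.
y_common = conv2d_valid(x, common)
y1 = y_common + conv2d_valid(x, r1)
y2 = y_common + conv2d_valid(x, r2)

# By linearity of convolution, the decomposed results match the direct ones.
assert np.allclose(y1, conv2d_valid(x, k1))
assert np.allclose(y2, conv2d_valid(x, k2))
```

The savings scale with how much the kernels overlap; the paper's contribution is a global search for such common kernels (and whole common convolutions) across the network, rather than the local candidate exploration of prior work.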
- Subjects :
- CONVOLUTIONAL neural networks
- ALGORITHMS
Details
- Language :
- English
- ISSN :
- 0218-1266
- Volume :
- 30
- Issue :
- 9
- Database :
- Complementary Index
- Journal :
- Journal of Circuits, Systems & Computers
- Publication Type :
- Academic Journal
- Accession number :
- 151831887
- Full Text :
- https://doi.org/10.1142/S0218126621501589