
StarSPA: Stride-Aware Sparsity Compression for Efficient CNN Acceleration

Authors :
Ngoc-Son Pham
Sangwon Shin
Lei Xu
Weidong Shi
Taeweon Suh
Source :
IEEE Access, Vol. 12, pp. 10893-10909 (2024)
Publication Year :
2024
Publisher :
IEEE, 2024.

Abstract

The presence of sparsity in both input features and weights within convolutional neural networks offers a valuable opportunity to significantly reduce the number of computations required during inference. Moreover, the practice of compressing input data serves to diminish storage requirements and lower data transfer costs, ultimately enhancing overall power efficiency. However, the compression of randomly sparse inputs introduces challenges in the input matching process, often resulting in substantial hardware overhead and increased power consumption. These challenges arise due to the irregular nature of sparse inputs and the variability in convolutional strides. In response to these challenges, this research introduces an innovative data compression method, named Stride-Aware Sparsity Compression (StarSPA), designed to effectively locate valid input values and expedite the multiplication process. To fully capitalize on this proposed compression method, a weight-stationary approach is employed for efficient convolution. Comprehensive simulations demonstrate that the proposed accelerator achieves speedup factors of 1.17×, 1.05×, 1.09×, 1.23×, and 1.12× when compared to the recent accelerator named SparTen for AlexNet, VGG16, GoogLeNet, ResNet34, and EfficientNetV2, respectively. Furthermore, FPGA implementation of the core reveals a noteworthy 2.55× reduction in hardware size and a 5× enhancement in energy efficiency when compared to SparTen.
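To make the idea concrete, the following is a minimal illustrative sketch (not the authors' hardware design) of the two ingredients the abstract describes: compressing a sparse input into (position, value) pairs, and a weight-stationary strided convolution that multiplies only the stored nonzeros. The function names `compress` and `sparse_conv1d` are hypothetical, and the 1-D case stands in for the 2-D feature maps of a real CNN.

```python
def compress(x):
    """Store only the nonzero entries of x as (position, value) pairs."""
    return [(i, v) for i, v in enumerate(x) if v != 0]

def sparse_conv1d(x_compressed, w, in_len, stride=1):
    """Weight-stationary strided 1-D convolution over a compressed input.

    Each weight w[k] stays fixed while the compressed input is scanned
    once; its product is accumulated into every output position it
    contributes to. Zero inputs were never stored, and zero weights are
    skipped, so no multiply is wasted on a zero operand.
    """
    out_len = (in_len - len(w)) // stride + 1
    y = [0.0] * out_len
    for k, wk in enumerate(w):          # weight stays "stationary"
        if wk == 0:                     # weight sparsity: skip zero taps
            continue
        for i, v in x_compressed:       # only nonzero inputs are visited
            # Input position i pairs with weight k for output o exactly
            # when i = o * stride + k, i.e. o = (i - k) / stride.
            o, rem = divmod(i - k, stride)
            if rem == 0 and 0 <= o < out_len:
                y[o] += wk * v
    return y

x = [0, 2, 0, 0, 3, 0, 1, 0]            # sparse input feature row
w = [1, 0, -1]                          # sparse 3-tap filter
print(sparse_conv1d(compress(x), w, len(x), stride=2))  # → [0.0, -3.0, 2.0]
```

The stride-aware step is the `divmod` check: with an irregular compressed input, a stored nonzero at position `i` is valid for an output only when `(i - k)` is divisible by the stride, which is exactly the input-matching problem the paper's compression format is designed to resolve cheaply in hardware.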

Details

Language :
English
ISSN :
2169-3536
Volume :
12
Database :
Directory of Open Access Journals
Journal :
IEEE Access
Publication Type :
Academic Journal
Accession number :
edsdoj.fe69c126564b43d980f9fce76ec7a0a6
Document Type :
article
Full Text :
https://doi.org/10.1109/ACCESS.2024.3353313