
BitS-Net: Bit-Sparse Deep Neural Network for Energy-Efficient RRAM-Based Compute-In-Memory.

Authors :
Karimzadeh, Foroozan
Yoon, Jong-Hyeok
Raychowdhury, Arijit
Source :
IEEE Transactions on Circuits and Systems I: Regular Papers; May 2022, Vol. 69, Issue 5, p1952-1961, 10p
Publication Year :
2022

Abstract

The rising popularity of intelligent mobile devices and the computational cost of deep learning-based models call for efficient and accurate on-device inference schemes. We propose a novel model compression scheme that allows inference to be carried out using bit-level sparsity, which can be efficiently implemented using in-memory computing macros. In this paper, we introduce a method called BitS-Net that leverages the benefits of bit-sparsity (where the number of zero bits exceeds the number of one bits in the binary representation of weight/activation values) when applied to compute-in-memory (CIM) with resistive RAM (RRAM), in order to develop energy-efficient DNN accelerators operating in the inference mode. We demonstrate that BitS-Net improves energy efficiency by up to 5x for ResNet models on the ImageNet dataset. [ABSTRACT FROM AUTHOR]
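To make the bit-sparsity notion in the abstract concrete, the following is a minimal, hypothetical sketch (not from the paper): it measures the fraction of zero bits across the fixed-point binary codes of a quantized weight tensor. The function name bit_sparsity, the 8-bit quantization width, and the example data are illustrative assumptions only.

import numpy as np

def bit_sparsity(values: np.ndarray, num_bits: int = 8) -> float:
    """Fraction of zero bits across the num_bits-wide binary codes of `values`.

    Assumes values are already quantized to unsigned integers in
    [0, 2**num_bits - 1]; a higher return value means higher bit-sparsity.
    """
    codes = values.astype(np.uint32) & ((1 << num_bits) - 1)
    # Count one-bits per element by testing each bit position.
    ones = sum(int(((codes >> b) & 1).sum()) for b in range(num_bits))
    total_bits = codes.size * num_bits
    return 1.0 - ones / total_bits

# Example (illustrative data, not from the paper): an 8-bit weight tensor
# concentrated near zero has far more zero bits than one bits, i.e. it is
# highly bit-sparse, which is the regime BitS-Net exploits in RRAM CIM.
weights = np.clip(np.random.normal(0, 8, size=1024), 0, 255).astype(np.uint8)
print(f"bit-sparsity: {bit_sparsity(weights, 8):.3f}")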

Details

Language :
English
ISSN :
1549-8328
Volume :
69
Issue :
5
Database :
Complementary Index
Journal :
IEEE Transactions on Circuits and Systems I: Regular Papers
Publication Type :
Periodical
Accession number :
156630290
Full Text :
https://doi.org/10.1109/TCSI.2022.3145687