A 24.1 TOPS/W mixed-signal BNN processor in 28-nm CMOS.
- Source :
- International Journal of Electronics; Aug2024, Vol. 111 Issue 8, p1288-1300, 13p
- Publication Year :
- 2024
-
Abstract
- A mixed-signal binarized neural network (BNN) processor for MNIST image classification is demonstrated based on analogue circuit networks. The BNN algorithm trains neural networks with binary weights and activations, reducing power consumption and memory size. The design performs the core operations of a multi-layer perceptron (MLP) using analogue circuits to reduce complexity and power consumption. The mixed-signal BNN processor employs a current-mirror neuron to perform multiply-and-accumulate (MAC) operations and sign activation functions. A near-threshold current-mirror neuron that computes the key operations of the BNN algorithm is used to achieve low power consumption. The design occupies 0.065 mm² in 28-nm CMOS with 560 B of on-chip SRAM. The 28-nm CMOS test chip achieves an energy efficiency of 24.1 TOPS/W and 94% accuracy on MNIST image classification. The design with binary weights and activations exhibits only 5% degradation in accuracy compared with models using floating-point precision. [ABSTRACT FROM AUTHOR]
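The core operation the abstract describes — a multiply-and-accumulate over binary ({-1, +1}) weights and activations followed by a sign activation — can be sketched in software as below. This is only an illustrative digital model of the computation, not the paper's analogue current-mirror implementation; the function names are hypothetical.

```python
# Illustrative sketch of the BNN neuron operation: binary MAC + sign
# activation. Not the paper's analogue circuit; a software model only.

def bnn_neuron(activations, weights):
    """Binary multiply-and-accumulate followed by a sign activation.

    activations, weights: sequences of +1/-1 values of equal length.
    Returns +1 if the accumulated sum is non-negative, else -1.
    """
    acc = sum(a * w for a, w in zip(activations, weights))
    return 1 if acc >= 0 else -1


def bnn_neuron_xnor(activations, weights):
    """Equivalent formulation: with binary values, a*w == +1 exactly
    when a == w (an XNOR), so the MAC reduces to counting matches:
    acc = matches - mismatches = 2*matches - n."""
    matches = sum(1 for a, w in zip(activations, weights) if a == w)
    acc = 2 * matches - len(weights)
    return 1 if acc >= 0 else -1
```

The XNOR formulation is what makes BNNs cheap in hardware: each binary multiply collapses to a single-gate comparison, and the accumulation becomes a population count.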
Details
- Language :
- English
- ISSN :
- 00207217
- Volume :
- 111
- Issue :
- 8
- Database :
- Complementary Index
- Journal :
- International Journal of Electronics
- Publication Type :
- Academic Journal
- Accession number :
- 177963992
- Full Text :
- https://doi.org/10.1080/00207217.2023.2224070