
An Energy-Efficient Bayesian Neural Network Implementation Using Stochastic Computing Method

Authors :
Jia, Xiaotao
Gu, Huiyi
Liu, Yuhao
Yang, Jianlei
Wang, Xueyan
Pan, Weitao
Zhang, Youguang
Cotofana, Sorin
Zhao, Weisheng
Source :
IEEE Transactions on Neural Networks and Learning Systems; September 2024, Vol. 35, Issue 9, pp. 12913-12923, 11 pages
Publication Year :
2024

Abstract

The robustness of Bayesian neural networks (BNNs) to real-world uncertainty and incompleteness has led to their adoption in some safety-critical fields. However, evaluating uncertainty during BNN inference requires repeated sampling and feed-forward computation, making BNNs challenging to deploy on low-power or embedded devices. This article proposes the use of stochastic computing (SC) to optimize the hardware performance of BNN inference in terms of energy consumption and hardware utilization. The proposed approach adopts bitstreams to represent Gaussian random numbers and applies them in the inference phase. This allows the complex transformation computations of the central limit theorem-based Gaussian random number generating (CLT-based GRNG) method to be omitted and multipliers to be simplified to AND operations. Furthermore, an asynchronous parallel pipelined calculation technique is proposed for the computing block to increase operating speed. Compared with a conventional binary radix-based BNN, the SC-based BNN (StocBNN), realized on an FPGA with 128-bit bitstreams, consumes far less energy and far fewer hardware resources, with less than 0.1% accuracy loss on the MNIST/Fashion-MNIST datasets.
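
As context for the AND-gate simplification mentioned in the abstract, the following is a minimal Python sketch of unipolar stochastic-computing multiplication: a value in [0, 1] is encoded as a bitstream whose fraction of 1s approximates that value, so the bitwise AND of two independent streams encodes their product. The helper names (to_bitstream, sc_multiply, decode) and the NumPy-based software simulation are illustrative assumptions, not the paper's FPGA implementation.

```python
import numpy as np

def to_bitstream(p, length=128, rng=None):
    """Encode a probability p in [0, 1] as a unipolar stochastic bitstream:
    each bit is 1 with probability p, so the mean of the stream approximates p."""
    rng = np.random.default_rng() if rng is None else rng
    return (rng.random(length) < p).astype(np.uint8)

def sc_multiply(stream_a, stream_b):
    """Multiply two independent unipolar bitstreams with a bitwise AND:
    P(a_i & b_i = 1) = P(a_i = 1) * P(b_i = 1), so the AND stream encodes the product."""
    return stream_a & stream_b

def decode(stream):
    """Recover the encoded value as the fraction of 1s in the stream."""
    return stream.mean()

# Example: 0.75 * 0.40 ~= 0.30, up to bitstream sampling noise.
rng = np.random.default_rng(0)
sa = to_bitstream(0.75, 128, rng)
sb = to_bitstream(0.40, 128, rng)
print(decode(sc_multiply(sa, sb)))
```

A longer bitstream (e.g., 128 bits as in StocBNN) reduces the variance of the decoded product at the cost of more clock cycles, which is the accuracy/latency trade-off inherent to stochastic computing.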

Details

Language :
English
ISSN :
2162-237X and 2162-2388
Volume :
35
Issue :
9
Database :
Supplemental Index
Journal :
IEEE Transactions on Neural Networks and Learning Systems
Publication Type :
Periodical
Accession number :
ejs67330701
Full Text :
https://doi.org/10.1109/TNNLS.2023.3265533