Efficient Approximate Posit Multipliers for Deep Learning Computation
- Source:
- IEEE Journal of Emerging and Selected Topics in Circuits and Systems; 2023, Vol. 13, Issue 1, pp. 201-211 (11 pages)
- Publication Year:
- 2023
Abstract
- The posit numeric format has been receiving increasing attention in recent years. Its tapered precision makes it especially suitable for many applications, including deep learning computation. However, because its component bit-widths are dynamic, implementing posit arithmetic in hardware is more expensive than its floating-point counterpart. To address this cost, posit multipliers with approximate computing features are proposed in this paper. The core idea of the proposed design is to truncate the fraction multiplier according to the estimated fraction bit-width of the product, so that the resource consumption of the fraction multiplier, and in turn of the fraction adder, can be significantly reduced. The proposed method is applied to both linear-domain and logarithm-domain posit multipliers. The 8-, 16-, and 32-bit versions of the proposed approximate posit multipliers are implemented and analyzed. For the 16-bit posit format commonly used in deep learning computation, the proposed approximate posit multiplier consumes 16% less power than the conventional posit multiplier design. The proposed 16-bit approximate logarithm multiplier achieves a 15% improvement in power consumption over the state-of-the-art approximate posit logarithm multiplier. The proposed 16-bit approximate posit multipliers are applied to the computation of several deep neural network models, and significant improvements in energy efficiency are achieved with negligible accuracy degradation.
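To make the idea in the abstract concrete, the sketch below decodes a posit and reports its fraction bit-width, showing why the fraction field shrinks as the regime grows (tapered precision) and why a fraction multiplier sized for the worst case wastes hardware that truncation can reclaim. This is an illustrative reconstruction from the standard posit(n, es) definition, not code from the paper; the function name `decode_posit` and the posit(16, 1) configuration are assumptions chosen for the example.

```python
# Illustrative posit decoder (not the paper's design): demonstrates that the
# fraction bit-width is data-dependent, which motivates truncating the
# fraction multiplier to the estimated width of the product's fraction.
# Assumed encoding: sign | regime (run of identical bits + terminator) |
# es exponent bits | fraction.
def decode_posit(bits: int, n: int = 16, es: int = 1):
    """Return (sign, scale, fraction, fraction_width) of an n-bit posit."""
    mask = (1 << n) - 1
    bits &= mask
    if bits == 0 or bits == 1 << (n - 1):
        return 0, 0, 0, 0                 # zero and NaR not handled here
    sign = bits >> (n - 1)
    if sign:                              # posits negate in two's complement
        bits = (-bits) & mask
    regime_bit = (bits >> (n - 2)) & 1    # first bit of the regime run
    run, i = 0, n - 2
    while i >= 0 and ((bits >> i) & 1) == regime_bit:
        run, i = run + 1, i - 1
    k = run - 1 if regime_bit else -run   # regime value
    i -= 1                                # skip the regime terminator bit
    exp = 0
    for _ in range(es):                   # exponent bits may be cut off
        exp <<= 1
        if i >= 0:
            exp |= (bits >> i) & 1
            i -= 1
    frac_width = max(i + 1, 0)            # whatever bits remain are fraction
    frac = bits & ((1 << frac_width) - 1) if frac_width else 0
    return sign, k * (1 << es) + exp, frac, frac_width

# Tapered precision in action: larger magnitudes leave fewer fraction bits,
# so bits beyond the product's encodable fraction width are discarded by the
# output rounding anyway and need not be computed at full precision.
for x in (0x4000, 0x7000, 0x7F00):
    _, scale, _, w = decode_posit(x)
    print(f"posit16 0x{x:04X}: scale = 2^{scale}, fraction bits = {w}")
```

Running the loop shows the fraction width dropping from 12 bits at 0x4000 to 6 bits at 0x7F00; it is this encoding property that an approximate, truncated fraction multiplier can exploit.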
Details
- Language:
- English
- ISSN:
- 2156-3357
- Volume:
- 13
- Issue:
- 1
- Database:
- Supplemental Index
- Journal:
- IEEE Journal of Emerging and Selected Topics in Circuits and Systems
- Publication Type:
- Periodical
- Accession number:
- ejs62603170
- Full Text:
- https://doi.org/10.1109/JETCAS.2022.3231642