
Towards Efficient RRAM-based Quantized Neural Networks Hardware: State-of-the-art and Open Issues

Authors:
Krestinskaya, O.
Zhang, L.
Salama, K. N.
Publication Year:
2022

Abstract

The increasing amount of data processed at the edge and the demand to reduce the energy consumption of large neural network architectures have initiated the transition from traditional von Neumann architectures towards in-memory computing paradigms. Quantization reduces the power and computation requirements of neural networks by limiting bit precision. Resistive Random Access Memory (RRAM) devices are strong candidates for Quantized Neural Network (QNN) implementations. Because the number of possible conductive states in RRAMs is limited, a certain level of quantization is always involved when designing RRAM-based neural networks. In this work, we provide a comprehensive analysis of state-of-the-art RRAM-based QNN implementations, showing where RRAMs stand in satisfying the criteria for efficient QNN hardware. We cover hardware and device challenges related to QNNs and highlight the main unsolved issues and possible future research directions.
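The link the abstract draws between limited RRAM conductive states and quantization can be illustrated with a minimal sketch. The snippet below (an illustrative assumption, not code from the paper) uniformly quantizes a weight matrix to a fixed number of discrete levels, analogous to mapping weights onto the finite conductance states of an RRAM cell:

```python
import numpy as np

def quantize_weights(w, n_levels):
    """Uniformly quantize weights to n_levels discrete values,
    mimicking the limited conductance states of an RRAM device.
    (Illustrative sketch; the paper surveys many quantization schemes.)"""
    w_min, w_max = w.min(), w.max()
    step = (w_max - w_min) / (n_levels - 1)
    # Snap each weight to the nearest of the n_levels allowed values.
    return np.round((w - w_min) / step) * step + w_min

# Example: 3-bit precision corresponds to 2**3 = 8 conductance levels.
rng = np.random.default_rng(0)
weights = rng.uniform(-1.0, 1.0, size=(4, 4))
q = quantize_weights(weights, n_levels=8)
```

After quantization, `q` contains at most 8 distinct values, which is the sense in which limited device states impose a quantization level on the network by construction.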

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2209.12260
Document Type:
Working Paper