
Design Space Exploration of Dense and Sparse Mapping Schemes for RRAM Architectures

Authors :
Lammie, Corey
Eshraghian, Jason K.
Li, Chenqi
Amirsoleimani, Amirali
Genov, Roman
Lu, Wei D.
Azghadi, Mostafa Rahimi
Publication Year :
2022

Abstract

The impact of device and circuit-level effects in mixed-signal Resistive Random Access Memory (RRAM) accelerators typically manifests as performance degradation of Deep Learning (DL) algorithms, but the degree of impact varies based on algorithmic features. These include network architecture, capacity, weight distribution, and the type of inter-layer connections. Techniques are continuously emerging to efficiently train sparse neural networks, which may exhibit activation sparsity and quantization and are exposed to memristive noise. In this paper, we present an extended Design Space Exploration (DSE) methodology to quantify the benefits and limitations of dense and sparse mapping schemes for a variety of network architectures. While sparse connectivity reduces power consumption and is often optimized for extracting localized features, its performance on tiled RRAM arrays may be more susceptible to noise than that of dense mapping schemes, owing to under-parameterization. Moreover, we present a case study quantifying and formalizing the trade-offs between typical non-idealities introduced into 1-Transistor-1-Resistor (1T1R) tiled memristive architectures and the size of modular crossbar tiles, using the CIFAR-10 dataset.

Comment: Accepted at 2022 IEEE International Symposium on Circuits and Systems (ISCAS). [v2] Fixed incorrectly labeled author affiliations for Chenqi Li, Amirali Amirsoleimani, and Roman Genov.
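To make the mapping concept concrete, the sketch below (not the authors' code; the tile size, conductance range, and noise level are illustrative assumptions) partitions a dense weight matrix onto modular crossbar tiles and applies per-device Gaussian programming noise. It only models device-local programming noise and omits the tile-size-dependent peripheral non-idealities that the paper's case study examines.

import numpy as np

G_MIN, G_MAX = 1e-6, 1e-4  # assumed device conductance range (S)

def weights_to_conductances(w, w_min, w_max):
    # Linearly map weights in [w_min, w_max] onto [G_MIN, G_MAX].
    return G_MIN + (w - w_min) / (w_max - w_min + 1e-12) * (G_MAX - G_MIN)

def conductances_to_weights(g, w_min, w_max):
    # Invert the linear mapping back to the weight domain.
    return w_min + (g - G_MIN) / (G_MAX - G_MIN) * (w_max - w_min)

def map_to_tiles(w, tile_size=128, noise_std=0.05, rng=None):
    # Partition w into tile_size x tile_size crossbar tiles, "program" each
    # device with multiplicative Gaussian noise, and return the perturbed
    # weight matrix as read back from the tiles.
    rng = np.random.default_rng() if rng is None else rng
    w_min, w_max = w.min(), w.max()
    w_hat = np.empty_like(w)
    rows, cols = w.shape
    for r in range(0, rows, tile_size):
        for c in range(0, cols, tile_size):
            block = w[r:r + tile_size, c:c + tile_size]
            g = weights_to_conductances(block, w_min, w_max)
            g = np.clip(g * (1 + rng.normal(0.0, noise_std, g.shape)), G_MIN, G_MAX)
            w_hat[r:r + tile_size, c:c + tile_size] = conductances_to_weights(g, w_min, w_max)
    return w_hat

rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, (512, 512))          # stand-in for a dense layer's weights
w_hat = map_to_tiles(w, tile_size=128, rng=rng)
print("relative error:", np.linalg.norm(w_hat - w) / np.linalg.norm(w))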

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2201.06703
Document Type :
Working Paper