201. Variation Tolerant RRAM Based Synaptic Architecture for On-Chip Training.
- Author
- Dongre, Ashvinikumar and Trivedi, Gaurav
- Abstract
Neuromorphic computing has emerged as a better alternative for developing next-generation artificial intelligent systems. Resistive Random Access Memory (RRAM) has been widely explored to represent the synaptic weights in artificial neural networks. Despite significant advances in device technology, the precise modulation of conductance required for maintaining high accuracy remains a challenge. These devices suffer mainly from Cycle-to-Cycle and Device-to-Device variations, which make it even more difficult to implement reliable neuromorphic systems. To address these issues, we propose an RRAM-based synaptic architecture with a continuous sensing and feedback scheme that stops RRAM programming when the required conductance is achieved. A weight-change mechanism is incorporated in the proposed architecture, enabling it to be used for on-chip training. Unlike contemporary architectures, which require a precise gap between different resistive states, the proposed feedback scheme for stopping the RESET operation provides flexibility in choosing resistive states. Thus, employing the proposed architecture, 4 bits/cell can be programmed simultaneously, which is on par with existing state-of-the-art designs. The proposed architecture is tolerant to 22% variation in RRAM resistance. The minimal resistance margin between two resistive states is $4\ \mathrm{k}\Omega$, with an average energy consumption of $0.1\ \mathrm{pJ/cell}$ and an average latency of $1.07\ \mu s$.
- Published
- 2023
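The abstract describes a continuous sensing and feedback scheme that stops programming once the target conductance is reached. The sketch below is a minimal, illustrative simulation of that program-and-verify idea, not the authors' implementation: the pulse step size, sensing noise, per-pulse variation model, and target levels are assumptions chosen for demonstration; only the 22% tolerance figure is taken from the abstract.

```python
# Minimal sketch (assumptions, not the paper's circuit): a feedback-controlled
# RESET loop that senses the cell conductance after each pulse and stops once
# it lies within a relative tolerance of the target level.
import random

def sense_conductance(g_true, read_noise=0.01):
    """Model a read of the cell conductance with small sensing noise."""
    return g_true * (1.0 + random.uniform(-read_noise, read_noise))

def program_cell(g_init, g_target, tolerance=0.22, step=5e-6, max_pulses=100):
    """Apply RESET pulses until the sensed conductance falls within
    `tolerance` (relative) of `g_target`, mimicking continuous sensing
    with feedback. Cycle-to-cycle variation is modeled as a random
    perturbation of each pulse's effect (illustrative only)."""
    g = g_init
    for pulse in range(max_pulses):
        if abs(sense_conductance(g) - g_target) <= tolerance * g_target:
            return g, pulse          # feedback stops programming here
        # Each RESET pulse lowers conductance by a variable amount (C2C variation).
        g -= step * (1.0 + random.uniform(-0.22, 0.22))
        g = max(g, 1e-6)             # clamp at a minimum conductance
    return g, max_pulses

if __name__ == "__main__":
    # Program toward one of several conductance levels (values are hypothetical).
    final_g, pulses = program_cell(g_init=2e-4, g_target=5e-5)
    print(f"Reached {final_g:.2e} S after {pulses} pulses")
```

Because the loop terminates on the sensed value rather than on a fixed pulse count, the achievable levels do not need a precise, pre-defined gap between states, which is the flexibility the abstract attributes to the feedback-stopped RESET operation.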