IMBUE: In-Memory Boolean-to-CUrrent Inference ArchitecturE for Tsetlin Machines
- Authors
Ghazal, Omar; Singh, Simranjeet; Rahman, Tousif; Yu, Shengqi; Zheng, Yujin; Balsamo, Domenico; Patkar, Sachin; Merchant, Farhad; Xia, Fei; Yakovlev, Alex; Shafik, Rishad
- Subjects
FOS: Computer and information sciences; Artificial Intelligence (cs.AI); Hardware Architecture (cs.AR); Emerging Technologies (cs.ET); Machine Learning (cs.LG)
- Abstract
In-memory computing for Machine Learning (ML) applications remedies the von Neumann bottleneck by organizing computation to exploit parallelism and locality. Non-volatile memory devices such as Resistive RAM (ReRAM) offer integrated switching and storage capabilities, showing promising performance for ML applications. However, ReRAM devices have design challenges, such as non-linear digital-to-analog conversion and circuit overheads. This paper proposes an In-Memory Boolean-to-Current Inference Architecture (IMBUE) that uses ReRAM-transistor cells to eliminate the need for such conversions. IMBUE processes Boolean feature inputs expressed as digital voltages and generates parallel current paths based on resistive memory states. The proportional column current is then translated back into the Boolean domain for further digital processing. The IMBUE architecture is inspired by the Tsetlin Machine (TM), an emerging ML algorithm based on intrinsically Boolean logic. IMBUE demonstrates significant performance improvements over binarized convolutional neural networks and digital in-memory TM implementations, achieving up to a 12.99x and 5.28x increase, respectively.
- Comment
Accepted at the ACM/IEEE International Symposium on Low Power Electronics and Design 2023 (ISLPED 2023)
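The clause logic that IMBUE accelerates can be summarised in a few lines of NumPy: each TM clause is an AND over the Boolean literals it includes, and the clause outputs are summed with their polarities to form a class vote, which IMBUE realises as a proportional column current that is thresholded back into the Boolean domain. The sketch below is illustrative only; the array names, shapes, and software-level vote summation are assumptions and do not reflect the paper's circuit-level implementation.

```python
# Minimal sketch of Tsetlin Machine clause evaluation and vote summation,
# i.e. the Boolean operation IMBUE maps onto parallel ReRAM current paths.
# Names and shapes are illustrative assumptions, not the paper's design.
import numpy as np

def clause_outputs(x, include):
    """Evaluate each clause as an AND over its included literals.

    x:       (F,) Boolean feature vector.
    include: (C, 2F) Boolean include mask over the literals [x, ~x];
             True means the literal participates in the clause.
    Returns a (C,) Boolean vector of clause outputs.
    """
    literals = np.concatenate([x, ~x])   # features and their negations
    violated = include & ~literals       # an included literal that is 0
    return ~violated.any(axis=1)         # clause fires only if nothing is violated

def class_vote(outputs, polarity):
    """Sum clause votes: +1 for positive-polarity clauses, -1 for negative.
    In IMBUE this summation corresponds to the proportional column current
    that is then translated back into the Boolean domain."""
    return int(np.sum(np.where(polarity, 1, -1) * outputs))

# Toy usage: 4 features, 3 clauses.
x = np.array([1, 0, 1, 1], dtype=bool)
include = np.zeros((3, 8), dtype=bool)
include[0, [0, 2]] = True   # clause 0: x0 AND x2
include[1, [5]] = True      # clause 1: NOT x1
include[2, [1]] = True      # clause 2: x1 (does not fire here)
polarity = np.array([True, True, False])
print(class_vote(clause_outputs(x, include), polarity))  # prints 2
```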
- Published
2023