
Efficient memristor accelerator for transformer self-attention functionality.

Authors :
Bettayeb, Meriem
Halawani, Yasmin
Khan, Muhammad Umair
Saleh, Hani
Mohammad, Baker
Source :
Scientific Reports; 10/15/2024, Vol. 14 Issue 1, p1-15, 15p
Publication Year :
2024

Abstract

The adoption of transformer networks has experienced a notable surge in various AI applications. However, the increased computational complexity, stemming primarily from the self-attention mechanism, parallels the manner in which convolution operations constrain the capabilities and speed of convolutional neural networks (CNNs). The self-attention algorithm, specifically its matrix-matrix multiplication (MatMul) operations, demands substantial memory and computation, thereby restricting the overall performance of the transformer. This paper introduces an efficient hardware accelerator for the transformer network, leveraging memristor-based in-memory computing. The design targets the memory bottleneck associated with MatMul operations in the self-attention process, utilizing approximate analog computation and the highly parallel computations facilitated by the memristor crossbar architecture. Remarkably, this approach resulted in a reduction of approximately 10 times in the number of multiply-accumulate (MAC) operations in transformer networks, while maintaining 95.47% accuracy for the MNIST dataset, as validated by a comprehensive circuit simulator employing NeuroSim 3.0. Simulation outcomes indicate an area utilization of 6895.7 μm², a latency of 15.52 seconds, an energy consumption of 3 mJ, and a leakage power of 59.55 μW. The methodology outlined in this paper represents a substantial stride towards a hardware-friendly transformer architecture for edge devices, poised to achieve real-time performance. [ABSTRACT FROM AUTHOR]
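As context for the bottleneck the abstract describes, the sketch below (not the paper's implementation; all names and the toy dimensions are illustrative assumptions) shows the two MatMul operations inside scaled dot-product self-attention and why their MAC count grows quadratically with sequence length, which is the cost the memristor crossbar is intended to absorb.

```python
import numpy as np

def self_attention(Q, K, V):
    """Illustrative scaled dot-product self-attention.

    The two @ operations below are the MatMuls whose MAC
    count dominates transformer compute.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # MatMul 1: (n, d) x (d, n)
    # numerically stable softmax over the last axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                        # MatMul 2: (n, n) x (n, d)

n, d = 8, 4  # toy sequence length and head dimension
rng = np.random.default_rng(0)
Q, K, V = rng.standard_normal((3, n, d))
out = self_attention(Q, K, V)
print(out.shape)  # (8, 4)

# Each MatMul performs n*n*d multiply-accumulates, so the pair
# costs 2*n^2*d MACs: quadratic in sequence length n.
macs = 2 * n * n * d
print(macs)
```

A memristor crossbar computes one such matrix-vector product in the analog domain in a single step (Ohm's law for multiply, Kirchhoff's current law for accumulate), which is what makes in-memory computing attractive for exactly these operations.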

Details

Language :
English
ISSN :
2045-2322
Volume :
14
Issue :
1
Database :
Complementary Index
Journal :
Scientific Reports
Publication Type :
Academic Journal
Accession number :
180284199
Full Text :
https://doi.org/10.1038/s41598-024-75021-z