1. Multilayer Dataflow based Butterfly Sparsity Orchestration to Accelerate Attention Workloads
- Authors
Wu, Haibin; Li, Wenming; Yan, Kai; Fan, Zhihua; Liu, Tianyu; Liu, Yuqun; Liu, Yanhuan; Qiang, Ziqing; Ye, Xiaochun; and Fan, Dongrui
- Subjects
Computer Science - Hardware Architecture
- Abstract
Recent neural networks (NNs) with self-attention exhibit competitive performance across different AI domains, but the attention mechanism itself incurs massive computation and memory demands. To this end, various sparsity patterns have been introduced to reduce the quadratic computation complexity, among which structured butterfly sparsity has proven efficient at reducing computation while maintaining model accuracy. However, its complicated data access pattern degrades utilization and makes parallelism hard to exploit on general block-oriented architectures such as GPUs. Since reconfigurable dataflow architectures are known for better data reusability and architectural flexibility in general NN acceleration, we apply them to butterfly sparsity to obtain better computational efficiency for attention workloads. We first propose a hybrid butterfly-sparsity network to obtain a better trade-off between attention accuracy and performance. Next, we propose a scalable multilayer dataflow method, supported by coarse-grained streaming parallelism designs, to orchestrate the butterfly sparsity computation on the dataflow array. Experiments show that, compared with the Jetson Xavier NX, our design achieves a speedup of up to $14.34\times$ ($9.29\times$ on average) and an $11.14\times$ energy efficiency improvement on attention workloads. Compared with SOTA attention accelerators of the same peak performance, our dataflow architecture achieves a $2.38\times$-$4.7\times$ efficiency improvement and a $6.60\times$-$15.37\times$ energy reduction with butterfly sparsity optimization.
- Comment
9 pages, 17 figures, ICCAD 2024, 2024/07/05, Butterfly Sparsity Optimization Using Dataflow
- Published
2024
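As background for the abstract's central claim, the following is a minimal NumPy sketch of the structured butterfly pattern it refers to; this is an illustration under generic assumptions (FFT-style butterfly factors with two nonzeros per row), not the paper's implementation or its hybrid butterfly-sparsity network. It shows why a butterfly-factored $N \times N$ linear map costs $O(N \log N)$ multiply-adds instead of the $O(N^2)$ of a dense map.

```python
import numpy as np

def butterfly_factor(n, stride, rng):
    """One butterfly factor of size n x n: row i mixes positions i and
    i XOR stride (its FFT-style butterfly partner), so each row has
    exactly two nonzeros. 'stride' and 'rng' are illustrative parameters."""
    B = np.zeros((n, n))
    for i in range(n):
        j = i ^ stride
        B[i, i] = rng.standard_normal()
        B[i, j] = rng.standard_normal()
    return B

def butterfly_matrix(n, rng):
    """Product of log2(n) butterfly factors. Applied factor by factor,
    the map costs 2*n MACs per factor, i.e. O(n log n) in total,
    versus n*n MACs for an unstructured dense matrix."""
    M = np.eye(n)
    stride = 1
    while stride < n:
        M = butterfly_factor(n, stride, rng) @ M
        stride *= 2
    return M

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 16  # n must be a power of two for this sketch
    butterfly_macs = 2 * n * int(np.log2(n))
    print(f"dense cost ~ {n * n} MACs, butterfly cost ~ {butterfly_macs} MACs")
    print(butterfly_matrix(n, rng).shape)
```

The sketch only illustrates the operation-count argument behind butterfly sparsity; how that structured pattern is scheduled onto the reconfigurable dataflow array is the subject of the paper itself.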