A Deep Neural Network Training Architecture With Inference-Aware Heterogeneous Data-Type
- Authors
Jaekang Shin, Lee-Sup Kim, and Seungkyu Choi
- Subjects
Speedup, Artificial neural network, Edge device, Computer science, Deep learning, Inference, Data type, Theoretical Computer Science, Computational Theory and Mathematics, Computer engineering, Hardware and Architecture, Datapath, Artificial intelligence, Throughput, Software - Abstract
As deep learning applications often suffer accuracy degradation from inputs distorted by varying environmental conditions, training with personal data has become essential for edge devices. Hence, on-device training through trainable deep learning accelerators has been actively studied. Nevertheless, previous research considers neither the fundamental datapath for training nor the importance of retaining high performance on inference tasks. In this work, we propose NeuroFlix, a deep neural network training accelerator that supports heterogeneous floating- and fixed-point data types for its input operands. Guided by two goals, 1) a separate precision decision for each input operand and 2) maintained high performance on inference, we represent activations and weights in low-bit fixed-point and error gradients in floating-point of up to half precision. A novel MAC architecture computes low- and high-precision modes for the different input combinations. By replacing a costly floating-point addition with brick-level separate accumulations, we achieve both an area-efficient architecture and high throughput for low-precision computation. Consequently, NeuroFlix outperforms previous state-of-the-art architectures, demonstrating high efficiency in both training and inference. Compared with an off-the-shelf bfloat16-based accelerator, it achieves 1.2x/2.0x speedup/energy efficiency in training, and a further 3.6x/4.5x improvement in inference.
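To make the heterogeneous data-type idea concrete, here is a minimal Python sketch, not the paper's implementation: the bit widths, quantizer, and all function names are illustrative assumptions. The forward MAC multiplies fixed-point activations by fixed-point weights with plain integer accumulation (the low-precision mode), while the weight-gradient computation pairs a floating-point error gradient with a fixed-point activation (the heterogeneous input combination the abstract describes).

```python
import random

def quantize_fixed(x, bits=8, frac_bits=6):
    """Signed fixed-point quantizer (hypothetical bit allocation;
    the paper's exact quantization scheme is not given in the abstract)."""
    scale = 1 << frac_bits
    qmax = (1 << (bits - 1)) - 1
    return max(-qmax - 1, min(qmax, round(x * scale)))

def forward_mac(acts, weights, bits=8, frac_bits=6):
    """Low-precision mode: fixed-point activation x fixed-point weight,
    accumulated as integers and dequantized once at the end."""
    scale = 1 << frac_bits
    acc = 0
    for a, w in zip(acts, weights):
        acc += quantize_fixed(a, bits, frac_bits) * quantize_fixed(w, bits, frac_bits)
    return acc / (scale * scale)

def backward_weight_grad(err_grads, acts, bits=8, frac_bits=6):
    """Heterogeneous mode: floating-point error gradient x fixed-point
    activation, one gradient row per output error element."""
    scale = 1 << frac_bits
    return [[e * quantize_fixed(a, bits, frac_bits) / scale for a in acts]
            for e in err_grads]

# Quick check: the quantized MAC should track an fp reference closely.
random.seed(0)
acts = [random.uniform(-1, 1) for _ in range(16)]
weights = [random.uniform(-1, 1) for _ in range(16)]
y_q = forward_mac(acts, weights)
y_f = sum(a * w for a, w in zip(acts, weights))
```

The point of the integer accumulation loop is that no per-product floating-point addition is needed; the abstract's "brick-level separate accumulations" serve a similar role in hardware, deferring expensive floating-point alignment until after the cheap integer work.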
- Published
- 2022