
A Deep Neural Network Training Architecture With Inference-Aware Heterogeneous Data-Type

Authors :
Jaekang Shin
Lee-Sup Kim
Seungkyu Choi
Source :
IEEE Transactions on Computers. 71:1216-1229
Publication Year :
2022
Publisher :
Institute of Electrical and Electronics Engineers (IEEE), 2022.

Abstract

Because deep learning applications often suffer accuracy degradation from inputs distorted by diverse environmental conditions, training with personal data has become essential for edge devices, and on-device training through trainable deep learning accelerators has been actively studied. Nevertheless, previous research does not consider the fundamental datapath for training or the importance of retaining high performance for inference tasks. In this work, we propose NeuroFlix, a deep neural network training accelerator that supports heterogeneous floating- and fixed-point data types for its input operands. Guided by two considerations, 1) a separate precision decision for each input operand and 2) maintaining high inference performance, we configure the data as low-bit fixed-point activations/weights and floating-point error gradients with up to half precision. A novel MAC architecture computes low- and high-precision modes for the different input combinations. By replacing costly floating-point additions with brick-level separate accumulations, we realize both an area-efficient architecture and high throughput for low-precision computation. Consequently, NeuroFlix outperforms previous state-of-the-art architectures, demonstrating high efficiency in both training and inference. Compared with an off-the-shelf bfloat16-based accelerator, it achieves 1.2×/2.0× improvements in speedup/energy efficiency for training and 3.6×/4.5× for inference.
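The following is a minimal sketch, not the paper's actual hardware design, of the heterogeneous data-type idea described above: activations and weights are quantized to low-bit fixed-point, error gradients stay in floating point, and fixed-point products are accumulated in separate integer "bricks" so that an expensive floating-point add is only needed once per brick rather than once per product. The bit-widths, brick size, and scale handling below are illustrative assumptions.

```python
import numpy as np

def quantize_fixed(x, bits=8):
    """Symmetric fixed-point quantization; returns integer codes and the scale (assumed scheme)."""
    scale = float(np.max(np.abs(x))) / (2 ** (bits - 1) - 1) or 1.0
    q = np.clip(np.round(x / scale), -(2 ** (bits - 1)), 2 ** (bits - 1) - 1)
    return q.astype(np.int32), scale

def mac_low_precision(acts, wgts, brick=16, bits=8):
    """Low-precision mode: fixed-point x fixed-point with brick-level separate
    accumulation -- integer adds inside each brick, one float add per brick."""
    qa, sa = quantize_fixed(acts, bits)
    qw, sw = quantize_fixed(wgts, bits)
    total = 0.0
    for i in range(0, len(qa), brick):
        brick_acc = np.sum(qa[i:i + brick] * qw[i:i + brick])  # pure integer accumulation
        total += float(brick_acc)                              # single FP add per brick
    return total * sa * sw

def mac_high_precision(err_grads, wgts, bits=8):
    """High-precision mode: floating-point error gradients x fixed-point weights,
    accumulated in floating point."""
    qw, sw = quantize_fixed(wgts, bits)
    return float(np.dot(err_grads.astype(np.float32), qw.astype(np.float32)) * sw)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a, w, g = rng.standard_normal(64), rng.standard_normal(64), rng.standard_normal(64)
    print("low-precision MAC :", mac_low_precision(a, w))
    print("high-precision MAC:", mac_high_precision(g, w))
    print("reference (fp64)  :", float(np.dot(a, w)))
```

The point of the brick-wise split is that the inner loop uses only cheap integer additions, deferring the wide floating-point accumulation to brick boundaries, which mirrors the area/throughput trade-off the abstract attributes to the proposed MAC.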

Details

ISSN :
2326-3814 and 0018-9340
Volume :
71
Database :
OpenAIRE
Journal :
IEEE Transactions on Computers
Accession number :
edsair.doi...........a12eb2ec008ba7e73b4cb1d66e692c32
Full Text :
https://doi.org/10.1109/tc.2021.3078316