
Adaptive Deep Neural Network Inference Optimization with EENet

Authors :
Ilhan, Fatih
Chow, Ka-Ho
Hu, Sihao
Huang, Tiansheng
Tekin, Selim
Wei, Wenqi
Wu, Yanzhao
Lee, Myungjin
Kompella, Ramana
Latapie, Hugo
Liu, Gaowen
Liu, Ling
Publication Year :
2023

Abstract

Well-trained deep neural networks (DNNs) treat all test samples equally during prediction, yet adaptive DNN inference with early exiting leverages the observation that some test examples are easier to predict than others. This paper presents EENet, a novel early-exit scheduling framework for multi-exit DNN models. Instead of sending every sample through all DNN layers during prediction, EENet learns an early-exit scheduler that intelligently terminates inference at an earlier exit for predictions in which the model already has high confidence. In contrast to previous heuristics-based early-exiting solutions, the EENet framework optimizes an early-exiting policy to maximize model accuracy while satisfying a given per-sample average inference budget. Extensive experiments are conducted on four computer vision datasets (CIFAR-10, CIFAR-100, ImageNet, Cityscapes) and two NLP datasets (SST-2, AgNews). The results demonstrate that adaptive inference with EENet outperforms representative existing early-exit techniques. We also provide a detailed visualization analysis of the comparison results to interpret the benefits of EENet.
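
To make the early-exit idea concrete, the sketch below shows confidence-threshold-based early-exit inference for a toy multi-exit network. It is an illustrative assumption, not EENet's learned scheduler: EENet optimizes the exit policy to meet a per-sample average inference budget, whereas this sketch simply stops at the first exit whose softmax confidence clears a hand-set threshold. All class names, layer sizes, and the threshold value are hypothetical.

```python
# Minimal sketch of early-exit inference for a multi-exit network (PyTorch).
# NOTE: this uses a fixed confidence threshold as a stand-in scheduler;
# EENet instead learns the exit policy under an inference-budget constraint.
import torch
import torch.nn as nn


class MultiExitNet(nn.Module):
    """Toy backbone with an auxiliary classifier (exit head) after each block."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.blocks = nn.ModuleList(
            [nn.Sequential(nn.Linear(32, 32), nn.ReLU()) for _ in range(3)]
        )
        self.exits = nn.ModuleList(
            [nn.Linear(32, num_classes) for _ in range(3)]
        )

    def forward(self, x):
        # Training-time forward pass: return logits from every exit.
        outputs = []
        for block, exit_head in zip(self.blocks, self.exits):
            x = block(x)
            outputs.append(exit_head(x))
        return outputs


@torch.no_grad()
def early_exit_predict(model: MultiExitNet, x: torch.Tensor, threshold: float = 0.9):
    """Stop at the first exit whose max softmax probability exceeds `threshold`.

    Expects a single sample of shape (1, 32); returns (predicted class, exit index).
    """
    for i, (block, exit_head) in enumerate(zip(model.blocks, model.exits)):
        x = block(x)
        probs = torch.softmax(exit_head(x), dim=-1)
        conf, pred = probs.max(dim=-1)
        # Exit early when confident enough, or fall through to the final exit.
        if conf.item() >= threshold or i == len(model.blocks) - 1:
            return pred.item(), i


# Example usage on a random input (illustrative only):
# model = MultiExitNet().eval()
# label, exit_idx = early_exit_predict(model, torch.randn(1, 32))
```

Easy samples tend to clear the threshold at an early exit and skip the remaining blocks, which is the computation saving that an early-exit scheduler such as EENet is designed to allocate optimally across a test set.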

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2301.07099
Document Type :
Working Paper