Transfer Learning for Performance Modeling of Deep Neural Network Systems
- Source : Scopus-Elsevier
- Publication Year : 2019
- Publisher : arXiv, 2019.
Abstract
- Modern deep neural network (DNN) systems are highly configurable, with a large number of options that significantly affect their non-functional behavior, for example inference time and energy consumption. Performance models allow us to understand and predict the effects of such configuration options on system behavior, but they are costly to build because of the large configuration spaces. Performance models built in one environment cannot be transferred directly to another; instead, models are usually rebuilt from scratch for each new environment, for example different hardware. Recently, transfer learning methods have been applied to reuse knowledge from performance models trained in one environment in another. In this paper, we perform an empirical study to understand the effectiveness of different transfer learning strategies for building performance models of DNN systems. Our results show that transferring information about the most influential configuration options and their interactions is an effective way to reduce the cost of building performance models in new environments.
- Comment: 2 pages, 2 figures, USENIX Conference on Operational Machine Learning, 2019
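The workflow described in the abstract can be illustrated with a minimal sketch (not the authors' actual code or data): fit a performance model from plentiful measurements in a source environment, identify the most influential configuration options, and reuse only those options and their pairwise interactions when fitting a model for the target environment from a small sample. The model choices (random forest for the source, linear model with interaction terms for the target), the synthetic data, and all variable names below are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical data: rows are configurations, columns are option values,
# y is the measured performance metric (e.g., inference time).
rng = np.random.default_rng(0)
n_options = 10
X_source = rng.integers(0, 2, size=(500, n_options)).astype(float)
y_source = (3.0 * X_source[:, 0]
            + 2.0 * X_source[:, 0] * X_source[:, 3]
            + rng.normal(0, 0.1, 500))

# 1. Fit a performance model in the cheap source environment.
source_model = RandomForestRegressor(n_estimators=100, random_state=0)
source_model.fit(X_source, y_source)

# 2. Identify the most influential options via feature importances.
k = 3
top_options = np.argsort(source_model.feature_importances_)[::-1][:k]

# 3. In the target environment, measure only a small sample and fit a
#    simple model restricted to the transferred options and their
#    pairwise interactions.
X_target = rng.integers(0, 2, size=(30, n_options)).astype(float)
y_target = (5.0 * X_target[:, 0]
            + 4.0 * X_target[:, 0] * X_target[:, 3]
            + rng.normal(0, 0.1, 30))

interactions = PolynomialFeatures(degree=2, interaction_only=True,
                                  include_bias=False)
Z_target = interactions.fit_transform(X_target[:, top_options])
target_model = LinearRegression().fit(Z_target, y_target)

# Predict the performance of a new configuration in the target environment.
x_new = rng.integers(0, 2, size=(1, n_options)).astype(float)
print(target_model.predict(interactions.transform(x_new[:, top_options])))
```

The point of the sketch is the cost asymmetry: the source model uses 500 measurements, while the target model needs only 30 because it inherits the knowledge of which options and interactions matter.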
Details
- Database : OpenAIRE
- Journal : Scopus-Elsevier
- Accession number : edsair.doi.dedup.....05d3630b20a762d4cd9696584515afce
- Full Text : https://doi.org/10.48550/arxiv.1904.02838