
Deep Neural Network Hyperparameter Optimization with Orthogonal Array Tuning

Authors :
Chang Ge
Xiaocong Chen
Manqing Dong
Xiang Zhang
Lina Yao
Source :
Communications in Computer and Information Science ISBN: 9783030368074, ICONIP (4)
Publication Year :
2019
Publisher :
Springer International Publishing, 2019.

Abstract

Deep learning algorithms have recently achieved excellent performance in a wide range of fields (e.g., computer vision). However, a severe challenge faced by deep learning is its high dependency on hyper-parameters: results may fluctuate dramatically under different hyper-parameter configurations. To address this issue, this paper presents an efficient Orthogonal Array Tuning Method (OATM) for deep learning hyper-parameter tuning. We describe the OATM approach in five detailed steps and elaborate on it using two widely used deep neural network structures (Recurrent Neural Networks and Convolutional Neural Networks). The proposed method is compared to state-of-the-art hyper-parameter tuning methods, including manual (e.g., grid search and random search) and automatic (e.g., Bayesian Optimization) ones. The experimental results show that OATM significantly reduces tuning time compared to the state-of-the-art methods while preserving satisfactory performance.
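The core idea the abstract describes is to cover the hyper-parameter space with a small orthogonal array of trials rather than a full grid, then pick the best level of each factor by range analysis over the trial scores. The following is only a minimal sketch of that style of search, not the authors' implementation: it assumes four hyper-parameters with three levels each, the standard L9(3^4) Taguchi array, and a hypothetical evaluate() placeholder standing in for actual model training.

    # Minimal OATM-style sketch (illustrative only, not the paper's code).
    import statistics

    # Standard L9 orthogonal array: 9 runs, 4 factors, 3 levels each (0-indexed).
    L9 = [
        [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
        [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
        [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
    ]

    # Hyper-parameters (factors) with three candidate values (levels) each.
    FACTORS = {
        "learning_rate": [1e-2, 1e-3, 1e-4],
        "batch_size": [32, 64, 128],
        "num_layers": [1, 2, 3],
        "dropout": [0.2, 0.3, 0.5],
    }

    def evaluate(config):
        """Hypothetical stand-in: train a model with `config`, return validation accuracy."""
        # Dummy deterministic score so the sketch runs end-to-end.
        return sum(hash(str(v)) % 100 for v in config.values()) / 400.0

    def oatm_search(factors, array):
        names = list(factors)
        # Run only the trials prescribed by the orthogonal array (9 instead of 3**4 = 81).
        scores = []
        for row in array:
            config = {n: factors[n][lvl] for n, lvl in zip(names, row)}
            scores.append(evaluate(config))
        # Range analysis: average the score of each level of each factor,
        # then select the best level per factor independently.
        best = {}
        for j, name in enumerate(names):
            level_means = [
                statistics.mean(s for row, s in zip(array, scores) if row[j] == lvl)
                for lvl in range(3)
            ]
            best[name] = factors[name][max(range(3), key=level_means.__getitem__)]
        return best

    if __name__ == "__main__":
        print(oatm_search(FACTORS, L9))

The time saving claimed in the abstract comes from this trial-count reduction: an exhaustive grid over the same space would need 81 training runs, whereas the orthogonal array prescribes 9.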

Details

ISBN :
978-3-030-36807-4
ISBNs :
9783030368074
Database :
OpenAIRE
Journal :
Communications in Computer and Information Science ISBN: 9783030368074, ICONIP (4)
Accession number :
edsair.doi...........2ddd1341f41a983c81b1a32189e02705
Full Text :
https://doi.org/10.1007/978-3-030-36808-1_31