
Comprehensive analysis of gradient-based hyperparameter optimization algorithms.

Authors :
Bakhteev, O. Y.
Strijov, V. V.
Source :
Annals of Operations Research. Jun 2020, Vol. 289, Issue 1, p51-65. 15p.
Publication Year :
2020

Abstract

The paper investigates the hyperparameter optimization problem. Hyperparameters are the parameters of the model parameter distribution, and an adequate choice of their values prevents model overfitting and yields higher predictive performance. Neural network models with a large number of hyperparameters are analyzed; for such models, hyperparameter optimization is computationally expensive. The paper proposes modifications of several gradient-based methods to optimize many hyperparameters simultaneously and compares the experimental results with random search. The main contribution of the paper is an analysis of hyperparameter optimization algorithms for models with a large number of parameters. To select precise and stable models, the authors suggest using two model selection criteria: cross-validation and the evidence lower bound. The experiments show that models optimized using the evidence lower bound give a higher error rate than models obtained using cross-validation, but also show greater stability when the data are noisy. Using the evidence lower bound is preferable when the model tends to overfit or when cross-validation is computationally expensive. The algorithms are evaluated on regression and classification datasets. [ABSTRACT FROM AUTHOR]
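The abstract does not spell out the authors' specific algorithms, but the general idea of gradient-based hyperparameter optimization can be illustrated on a toy problem. The sketch below is a generic, minimal example (not the paper's method): for ridge regression, the inner training problem has a closed-form solution, so the exact hypergradient of the validation loss with respect to the regularization hyperparameter can be computed by implicit differentiation and used for gradient descent. All names and data here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data, split into training and validation sets.
X_tr = rng.normal(size=(80, 5))
w_true = rng.normal(size=5)
y_tr = X_tr @ w_true + 0.5 * rng.normal(size=80)
X_va = rng.normal(size=(40, 5))
y_va = X_va @ w_true + 0.5 * rng.normal(size=40)

def ridge_fit(lam):
    """Inner problem: closed-form ridge solution w(lambda)."""
    A = X_tr.T @ X_tr + lam * np.eye(X_tr.shape[1])
    return np.linalg.solve(A, X_tr.T @ y_tr)

def val_loss(w):
    """Outer objective: mean squared error on the validation set."""
    r = X_va @ w - y_va
    return 0.5 * np.mean(r ** 2)

def hypergradient(lam):
    """Exact d(val_loss)/d(lambda) via implicit differentiation:
    since dA/dlambda = I, dw/dlambda = -A^{-1} w; then apply the chain rule."""
    A = X_tr.T @ X_tr + lam * np.eye(X_tr.shape[1])
    w = np.linalg.solve(A, X_tr.T @ y_tr)
    dw_dlam = -np.linalg.solve(A, w)
    grad_w = X_va.T @ (X_va @ w - y_va) / len(y_va)
    return grad_w @ dw_dlam

# Gradient descent on log(lambda) so lambda stays positive.
log_lam = 0.0
for _ in range(200):
    lam = np.exp(log_lam)
    # chain rule: d/d(log lam) = lam * d/d(lam)
    log_lam -= 0.5 * lam * hypergradient(lam)
```

For neural networks there is no closed-form inner solution, which is why the methods the paper studies rely on approximations such as differentiating through the optimization trajectory; the implicit-differentiation trick above only shows the shape of the computation on a case where it is exact.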

Details

Language :
English
ISSN :
0254-5330
Volume :
289
Issue :
1
Database :
Academic Search Index
Journal :
Annals of Operations Research
Publication Type :
Academic Journal
Accession number :
143492534
Full Text :
https://doi.org/10.1007/s10479-019-03286-z