HELP: An LSTM-based approach to hyperparameter exploration in neural network learning.
- Authors
- Li, Wendi; Ng, Wing W. Y.; Wang, Ting; Pelillo, Marcello; Kwong, Sam
- Subjects
- *CONVOLUTIONAL neural networks; *TIME series analysis; *ARTIFICIAL neural networks
- Abstract
• HELP improves the efficiency of hyperparameter tuning with random exploration. • HELP avoids bad hyperparameters in future exploration by learning with an RNN. • HELP finds the optimal hyperparameters at each iteration using information gathered during training. Hyperparameter selection is critical to the success of deep neural network training. Random search over hyperparameters may take a long time to converge to good results, because training a deep neural network with a huge number of parameters for every selected hyperparameter setting is very time-consuming. In this work, we propose the Hyperparameter Exploration LSTM-Predictor (HELP), an improved random exploration method that combines probability-based exploration with LSTM-based prediction. HELP has a higher probability of finding better hyperparameters in less time. HELP takes a series of hyperparameters over a time period as input and predicts the fitness values of these hyperparameters. Exploration directions in the hyperparameter space that yield higher predicted fitness values then have a higher probability of being explored in the next round. Experimental results on training both a Generative Adversarial Network and a Convolutional Neural Network show that HELP finds hyperparameters yielding better results and converges faster. [ABSTRACT FROM AUTHOR]
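The probability-based exploration described in the abstract can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's implementation: the `predict_fitness` callable stands in for the trained LSTM predictor, and the fixed candidate direction set and softmax weighting are assumptions made for the sketch.

```python
import math
import random

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def explore_step(current, directions, predict_fitness, step=0.1, rng=random):
    """One probability-based exploration step (toy sketch of HELP's idea).

    Candidate hyperparameter settings are generated along each direction;
    a predictor (standing in for the paper's LSTM) scores them, and the
    next setting is sampled with probability proportional to the softmax
    of the predicted fitness, so promising directions are explored more
    often while poor ones are not ruled out entirely.
    """
    candidates = [[c + step * d for c, d in zip(current, dvec)]
                  for dvec in directions]
    scores = [predict_fitness(cand) for cand in candidates]
    probs = softmax(scores)
    idx = rng.choices(range(len(candidates)), weights=probs, k=1)[0]
    return candidates[idx]
```

As a usage sketch, a toy fitness function peaked at (0.5, 0.5) biases the walk toward that point while still occasionally sampling other directions:

```python
rng = random.Random(0)
toy_fitness = lambda h: -sum((x - 0.5) ** 2 for x in h)  # hypothetical stand-in
point = [0.0, 0.0]
dirs = [[1, 0], [-1, 0], [0, 1], [0, -1]]
for _ in range(20):
    point = explore_step(point, dirs, toy_fitness, step=0.1, rng=rng)
```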
- Published
- 2021