Enhanced harmony search for hyperparameter tuning of deep neural networks.
- Source :
- Soft Computing - A Fusion of Foundations, Methodologies & Applications. Sep 2024, Vol. 28, Issue 17/18, p9905-9919. 15p.
- Publication Year :
- 2024
Abstract
- The performance of a deep neural network (DNN) is affected by its configuration as well as by its training process. Determining the configuration of a DNN and training its parameters are both challenging, high-dimensional optimization problems, so methods are needed that can optimize the configuration and the parameters together. Most existing DNN optimization research concerns the optimization of DNN parameters; only a few studies discuss the optimization of DNN configuration. In this paper, an enhanced harmony search is proposed to optimize the configuration of a fully connected neural network (FCNN). The enhancement introduces several types of harmony memory consideration rate and of harmony memory selection. Four types of harmony memory consideration rate are proposed: constant rate, linear increase rate, linear decrease rate, and sigmoid rate. Two types of harmony memory selection are proposed: rank-based selection and random selection. Combining the consideration-rate types with the selection types yields eight harmony search scenarios. The performance of the proposed method is compared to random search and a genetic algorithm on 12 classification datasets. The experimental results show that the proposed harmony search outperforms random search on 8 of the 12 problems and performs approximately the same on the other 4. Harmony search also outperforms the genetic algorithm on 5 problems, performs approximately the same on 6, and performs worse on 1. In addition, combining the varied harmony memory consideration rates with rank-based selection improves on ordinary harmony search. The combination of the linear increase rate and rank-based selection performs best of all combinations: it beats ordinary harmony search on 7 problems, is approximately equal on 3, and is worse on 2. The results show that the proposed method has two advantages for solving classification problems with a DNN. First, the DNN configuration is cast as an optimization problem, so the method can find an FCNN configuration suited to a specific task. Second, the approach optimizes globally: it tunes the DNN hyperparameters (configuration) together with the DNN parameters (connection weights), and can therefore find the best combination of the two. However, a strategy is still needed to balance hyperparameter tuning against parameter tuning, since an inappropriate balance can incur a high computational cost. Future research can be directed at balancing the two during the solution search. [ABSTRACT FROM AUTHOR]
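The abstract names four consideration-rate schedules and two harmony memory selection schemes but does not give their exact forms. The Python sketch below illustrates one plausible reading of those components; the rate bounds lo/hi, the sigmoid steepness k, the linear rank weights, and the pitch-adjustment step are illustrative assumptions, not the paper's definitions.

```python
# A minimal sketch of the two proposed components, under assumed forms for
# details the abstract does not fix (lo/hi bounds, sigmoid steepness, rank
# weighting, pitch-adjustment step).
import math
import random

def hmcr_schedule(kind, t, t_max, lo=0.7, hi=0.99, k=10.0):
    """Harmony memory consideration rate at iteration t (0-based).

    kind: 'constant' | 'linear_inc' | 'linear_dec' | 'sigmoid'
    """
    frac = t / max(t_max - 1, 1)
    if kind == "constant":
        return hi
    if kind == "linear_inc":
        return lo + (hi - lo) * frac
    if kind == "linear_dec":
        return hi - (hi - lo) * frac
    if kind == "sigmoid":
        # S-shaped rise from lo to hi centered mid-run; k sets the steepness.
        return lo + (hi - lo) / (1.0 + math.exp(-k * (frac - 0.5)))
    raise ValueError(kind)

def rank_based_pick(memory, fitnesses):
    """Pick one harmony, weighting better (lower-loss) harmonies more heavily.

    Linear rank weights: the best harmony gets weight m, the worst gets 1.
    """
    order = sorted(range(len(memory)), key=lambda i: fitnesses[i])
    weights = [0.0] * len(memory)
    for rank, i in enumerate(order):        # rank 0 = best
        weights[i] = len(memory) - rank
    return random.choices(memory, weights=weights, k=1)[0]

def improvise(memory, fitnesses, t, t_max, kind, selection, bounds, par=0.3):
    """Build one new candidate hyperparameter vector.

    Integer-valued hyperparameters (e.g. hidden units) would additionally
    need rounding; omitted here to keep the sketch short.
    """
    hmcr = hmcr_schedule(kind, t, t_max)
    new = []
    for d, (lo_d, hi_d) in enumerate(bounds):
        if random.random() < hmcr:
            # Consider harmony memory, via rank-based or plain random selection.
            src = (rank_based_pick(memory, fitnesses) if selection == "rank"
                   else random.choice(memory))
            x = src[d]
            if random.random() < par:       # pitch adjustment: small local move
                x = min(max(x + random.uniform(-0.1, 0.1) * (hi_d - lo_d), lo_d),
                        hi_d)
        else:
            x = random.uniform(lo_d, hi_d)  # fresh random value
        new.append(x)
    return new

# Example: two hyperparameters (hidden units, log10 learning rate), toy memory.
memory = [[64, -3.0], [128, -2.0], [32, -4.0]]
fitnesses = [0.21, 0.18, 0.35]              # validation losses (lower is better)
cand = improvise(memory, fitnesses, t=10, t_max=100,
                 kind="linear_inc", selection="rank",
                 bounds=[(8, 512), (-5.0, -1.0)])
```

Each candidate produced this way would then be trained and scored, and would replace the worst harmony in memory if it improves on it, following the usual harmony search loop; the specific balance between this hyperparameter loop and weight training is exactly the open issue the abstract flags.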
Details
- Language :
- English
- ISSN :
- 1432-7643
- Volume :
- 28
- Issue :
- 17/18
- Database :
- Academic Search Index
- Journal :
- Soft Computing - A Fusion of Foundations, Methodologies & Applications
- Publication Type :
- Academic Journal
- Accession number :
- 180373705
- Full Text :
- https://doi.org/10.1007/s00500-024-09840-7