
Analyzing Efficiency-based Hyperparameter Tuning Optimization Methods on LSTMs for Generative ARIMA Models.

Authors :
Taulananda, Anon
Source :
International Journal of High School Research; Sep 2024, Vol. 6 Issue 9, p10-15, 6p
Publication Year :
2024

Abstract

In the deep learning field, Long Short-Term Memory networks (LSTMs) are a type of recurrent neural network (RNN) that use gates to control and retain important information, enabling them to handle long-range dependencies and effectively forecast time series. Despite their effectiveness, LSTMs can be further optimized for greater accuracy and efficiency. Optimizing these networks involves tuning hyperparameters: configuration settings that are chosen before training rather than learned from the data. While this can be done through trial and error, it is more efficiently accomplished with hyperparameter optimization algorithms such as random search, Bayesian optimization, and Hyperband. This paper compares these three algorithms, finding that random search achieved low mean-squared error but raised concerns about reproducibility and consistency; Hyperband excelled in resource efficiency but struggled to find the optimal configuration; and Bayesian optimization offered consistency and a smooth learning process, at the cost of higher computation and the need for its own parameter adjustments. [ABSTRACT FROM AUTHOR]
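To illustrate the contrast the abstract draws, the sketch below compares plain random search against successive halving, the resource-allocation loop at the core of Hyperband. It is a minimal, hypothetical sketch: the `val_loss` objective is a cheap synthetic stand-in for an LSTM's validation loss (in the paper, each evaluation would be a full training run), and all function and parameter names here are illustrative, not taken from the source.

```python
import math
import random

def val_loss(lr, units, budget=1.0):
    """Synthetic stand-in for LSTM validation loss (lower is better).

    Optimum is near lr=0.01, units=64. A smaller training budget yields
    a noisier estimate of the true loss, mimicking partial training.
    """
    base = (math.log10(lr) + 2.0) ** 2 + ((units - 64) / 64.0) ** 2
    noise = random.uniform(0, 0.5) * (1.0 - budget)
    return base + noise

def sample_config(rng):
    # Log-uniform learning rate, hidden units in steps of 16.
    return {"lr": 10 ** rng.uniform(-4, -1), "units": rng.randrange(16, 257, 16)}

def random_search(n_trials, seed=0):
    """Evaluate n_trials random configurations, each at full budget."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        cfg = sample_config(rng)
        loss = val_loss(cfg["lr"], cfg["units"], budget=1.0)
        if best is None or loss < best[0]:
            best = (loss, cfg)
    return best

def successive_halving(n_configs, eta=3, seed=0):
    """Hyperband's inner loop: start many configs on a small budget,
    keep the best 1/eta each round while multiplying the budget by eta."""
    rng = random.Random(seed)
    configs = [sample_config(rng) for _ in range(n_configs)]
    budget = 1.0 / eta ** 2
    while len(configs) > 1:
        configs = sorted(
            configs,
            key=lambda c: val_loss(c["lr"], c["units"], budget=min(budget, 1.0)),
        )[: max(1, len(configs) // eta)]
        budget *= eta
    winner = configs[0]
    return val_loss(winner["lr"], winner["units"], budget=1.0), winner
```

Random search spends the full budget on every trial, so its final losses are exact but expensive; successive halving scores many configurations cheaply and risks discarding a good one on a noisy low-budget estimate, which mirrors the efficiency-versus-optimality trade-off the paper reports for Hyperband.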

Details

Language :
English
ISSN :
2642-1046
Volume :
6
Issue :
9
Database :
Complementary Index
Journal :
International Journal of High School Research
Publication Type :
Academic Journal
Accession number :
180263349
Full Text :
https://doi.org/10.36838/v6i9.2