An Improved Reacceleration Optimization Algorithm Based on the Momentum Method for Image Recognition.
- Source :
- Mathematics (2227-7390), Jun 2024, Vol. 12, Issue 11, p1759, 15 pp.
- Publication Year :
- 2024
Abstract
- The optimization algorithm plays a crucial role in image recognition with neural networks; however, it is challenging to accelerate a model's convergence while maintaining high precision. The momentum method, a commonly used stochastic gradient descent optimization algorithm, requires many epochs to find the optimal parameters during model training: its gradient descent velocity depends solely on the historical gradients and is not subject to random fluctuations. To address this issue, an optimization algorithm that enhances the gradient descent velocity, the momentum reacceleration gradient descent (MRGD), is proposed. The algorithm takes the element-wise (pointwise) division of the current momentum by the gradient and multiplies the resulting ratio with the gradient, allowing it to adjust the parameters' update rate and step size according to the gradient descent state and thereby achieve faster convergence and higher precision when training deep learning models. The effectiveness of this mechanism is further demonstrated by applying the reacceleration term to the Adam optimizer, yielding the MRGDAdam algorithm. Both algorithms are verified on multiple image classification datasets, and the experimental results show that the proposed optimization algorithms enable models to reach higher recognition accuracy within a small number of training epochs, as well as speeding up training. This study provides new ideas and directions for future optimizer research. [ABSTRACT FROM AUTHOR]
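The update rule itself is not reproduced in this record, so the sketch below is only one plausible reading of the abstract: classical momentum SGD extended with a hypothetical element-wise "reacceleration" ratio of momentum to gradient that rescales the step. The function name mrgd_step, the hyperparameters lr, mu, and eps, and the absolute-value form of the ratio are all illustrative assumptions, not the authors' published formula.

import numpy as np

def mrgd_step(theta, grad, velocity, lr=0.01, mu=0.9, eps=1e-8):
    """One parameter update illustrating the reacceleration idea.

    NOTE: the exact MRGD rule is not given in this abstract; the 'ratio'
    term below (element-wise momentum/gradient) is a hypothetical reading
    of "point division of the current momentum and the gradient,
    multiplied with the gradient".
    """
    # Classical momentum accumulation (heavy-ball style).
    velocity = mu * velocity + grad
    # Hypothetical reacceleration factor: element-wise ratio of the
    # accumulated momentum to the current gradient, used to enlarge or
    # shrink the step according to the descent state.
    ratio = np.abs(velocity) / (np.abs(grad) + eps)
    # Scaled gradient step.
    theta = theta - lr * ratio * grad
    return theta, velocity

Per the abstract, MRGDAdam applies the same reacceleration mechanism inside the Adam update; that variant is omitted here because the record gives no further detail about how the ratio interacts with Adam's second-moment estimate.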
Details
- Language :
- English
- ISSN :
- 2227-7390
- Volume :
- 12
- Issue :
- 11
- Database :
- Academic Search Index
- Journal :
- Mathematics (2227-7390)
- Publication Type :
- Academic Journal
- Accession number :
- 177856913
- Full Text :
- https://doi.org/10.3390/math12111759