
An Improved Adam Optimization Algorithm Combining Adaptive Coefficients and Composite Gradients Based on Randomized Block Coordinate Descent

Authors:
Miaomiao Liu
Dan Yao
Zhigang Liu
Jingfeng Guo
Jing Chen
Source:
Computational Intelligence and Neuroscience. 2023:1-14
Publication Year:
2023
Publisher:
Hindawi Limited, 2023.

Abstract

An improved Adam optimization algorithm combining adaptive coefficients and composite gradients based on randomized block coordinate descent is proposed to address shortcomings of the Adam algorithm, namely slow convergence, a tendency to miss the global optimum, and inefficiency on high-dimensional vectors. First, an adaptive coefficient is used to adjust the gradient deviation and correct the search direction. Then, a predicted gradient is introduced and combined with the current gradient and the first-order momentum to form a composite gradient, which improves the global optimization ability. Finally, randomized block coordinate descent is used to determine the gradient update mode, which reduces computational overhead. Simulation experiments on two standard classification datasets show that the proposed algorithm converges faster and reaches higher accuracy than six gradient descent methods while significantly reducing CPU and memory utilization. In addition, BP neural networks optimized by the six algorithms, respectively, are used to predict reservoir porosity from logging data; the results show that the proposed method has lower system overhead, higher accuracy, and stronger stability, with the absolute error within 0.1% for more than 86% of the data, further verifying its effectiveness.
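The abstract names the three modifications to Adam but not their formulas, so the Python sketch below is only an illustrative reconstruction under stated assumptions, not the authors' method: the look-ahead form of the predicted gradient, the mixing weight lam, the adaptive coefficient a, and the block fraction block_frac are all choices made here for demonstration.

```python
import numpy as np

def improved_adam_step(theta, grad_fn, state, lr=1e-3, beta1=0.9, beta2=0.999,
                       eps=1e-8, lam=0.5, block_frac=0.5, rng=None):
    """One step of the sketched Adam variant: composite gradient,
    adaptive damping coefficient, and a random coordinate-block update."""
    if rng is None:
        rng = np.random.default_rng()
    m, v, t = state["m"], state["v"], state["t"] + 1

    g = grad_fn(theta)                    # current gradient
    g_pred = grad_fn(theta - lr * m)      # "predicted" look-ahead gradient (assumed form)
    # Composite gradient: blend the current gradient, the predicted gradient,
    # and the first-order momentum (weights lam and beta1 are illustrative).
    g_comp = lam * g + (1.0 - lam) * g_pred + beta1 * m

    # Hypothetical adaptive coefficient: shrink the step where the composite
    # gradient deviates strongly from the running momentum.
    a = 1.0 / (1.0 + np.abs(g_comp - m))

    m = beta1 * m + (1.0 - beta1) * g_comp       # first-moment estimate
    v = beta2 * v + (1.0 - beta2) * g_comp ** 2  # second-moment estimate
    m_hat = m / (1.0 - beta1 ** t)               # bias correction, as in standard Adam
    v_hat = v / (1.0 - beta2 ** t)

    # Randomized block coordinate descent: update only a random subset of
    # coordinates, which cuts per-step computation on high-dimensional vectors.
    k = max(1, int(block_frac * theta.size))
    idx = rng.choice(theta.size, size=k, replace=False)
    theta = theta.copy()
    theta[idx] -= lr * a[idx] * m_hat[idx] / (np.sqrt(v_hat[idx]) + eps)

    state.update(m=m, v=v, t=t)
    return theta

# Usage: minimise a simple quadratic f(x) = ||x||^2 with gradient 2x.
rng = np.random.default_rng(0)
theta = np.array([5.0, -3.0, 2.0, 8.0])
state = {"m": np.zeros_like(theta), "v": np.zeros_like(theta), "t": 0}
for _ in range(3000):
    theta = improved_adam_step(theta, lambda x: 2.0 * x, state, lr=0.05, rng=rng)
print(theta)  # all coordinates approach the minimiser at 0
```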

Details

ISSN:
1687-5273 and 1687-5265
Volume:
2023
Database:
OpenAIRE
Journal:
Computational Intelligence and Neuroscience
Accession number:
edsair.doi.dedup.....ed2428c0fbe7014a5f919b22f25fa745