Variable three-term conjugate gradient method for training artificial neural networks.
- Source :
- Neural Networks. Feb 2023, Vol. 159, p125-136. 12p.
- Publication Year :
- 2023
Abstract
- Artificial neural networks (ANNs) have been widely adopted as general computational tools in computer science and in many other engineering fields. Stochastic gradient descent (SGD) and adaptive methods such as Adam are widely used as robust optimization algorithms for training ANNs. However, the effectiveness of these algorithms is limited because they compute a search direction based only on the first-order gradient. Although higher-order methods such as Newton's method have been proposed, they require the Hessian matrix to be positive semi-definite, and inverting it incurs a high computational cost. Therefore, in this paper, we propose a variable three-term conjugate gradient (VTTCG) method that approximates the Hessian matrix to enhance the search direction and uses a variable step size to achieve improved convergence stability. To evaluate the performance of the VTTCG method, we train different ANNs on benchmark image classification and generation datasets. We also conduct a similar experiment in which a grasp generation and selection convolutional neural network (GGS-CNN) is trained to perform intelligent robotic grasping. After evaluation in a simulated environment, we also test the GGS-CNN with a physical grasping robot. The experimental results show that the performance of the VTTCG method is superior to that of four conventional methods: SGD, Adam, AMSGrad, and AdaBelief. [ABSTRACT FROM AUTHOR]
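This record does not reproduce the paper's update rules, so the sketch below is only a rough illustration of the family of methods the abstract refers to: a generic three-term conjugate gradient direction d_k = -g_k + beta_k*d_{k-1} - theta_k*y_{k-1} with y_{k-1} = g_k - g_{k-1}, applied to a toy quadratic. The Hestenes-Stiefel-style coefficients, the fixed step size, and the function names are assumptions for illustration, not the VTTCG rules from the paper.

```python
# Illustrative three-term conjugate gradient loop on a toy quadratic.
# NOT the paper's VTTCG rules: beta/theta below are standard
# Hestenes-Stiefel-style choices and the step size is fixed.
import numpy as np

def three_term_cg(grad, x0, lr=0.1, n_iter=500, tol=1e-8):
    """Generic three-term CG: d_k = -g_k + beta_k*d_{k-1} - theta_k*y_{k-1},
    where y_{k-1} = g_k - g_{k-1}."""
    x = x0.astype(float).copy()
    g = grad(x)
    d = -g                                   # first step: steepest descent
    for _ in range(n_iter):
        x = x + lr * d                       # fixed step (the paper uses a variable one)
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            break
        y = g_new - g                        # gradient difference
        denom = d @ y + 1e-12                # guard against division by zero
        beta = (g_new @ y) / denom           # Hestenes-Stiefel coefficient
        theta = (g_new @ d) / denom          # third-term coefficient
        d = -g_new + beta * d - theta * y    # three-term direction
        g = g_new
    return x

# Usage: minimize f(x) = 0.5*x^T A x - b^T x, whose exact minimizer is A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_hat = three_term_cg(lambda x: A @ x - b, np.zeros(2))
print(x_hat, np.linalg.solve(A, b))          # the two should be close
```

With these coefficient choices the direction satisfies g_k^T d_k = -||g_k||^2, i.e. it is always a descent direction, which is the property three-term variants are typically designed to guarantee; the contributions described in the abstract (the Hessian approximation and the variable step size) would replace the fixed lr used here.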
Details
- Language :
- English
- ISSN :
- 0893-6080
- Volume :
- 159
- Database :
- Academic Search Index
- Journal :
- Neural Networks
- Publication Type :
- Academic Journal
- Accession Number :
- 161554326
- Full Text :
- https://doi.org/10.1016/j.neunet.2022.12.001