1. Optimization of a Convolutional Neural Network Using a Hybrid Algorithm
- Author
- Huang, CL, Shih, YC, Lai, CM, Chung, VYY, Zhu, WB, Yeh, WC, and He, X
- Abstract
© 2019 IEEE. In recent years, Convolutional Neural Networks (CNNs) have been widely used in image recognition due to their aptitude for large-scale image processing. A CNN uses Back-propagation (BP) to train its weights and biases, progressively reducing the error. The most common optimizers based on the BP algorithm are Stochastic Gradient Descent (SGD), Adam, and Adadelta. These optimizers, however, have been shown to easily become trapped in local optima. Little research has applied Soft Computing to CNNs to address this problem, and most studies that have done so focus on Particle Swarm Optimization (PSO). Among them, the hybrid algorithm combining PSO with SGD proposed by Albeahdili improves image classification accuracy over that achieved by the original CNN. This study proposes combining Improved Simplified Swarm Optimization (iSSO) with SGD; the resulting iSSO-SGD is intended to train CNNs more efficiently, establish a better prediction model, and improve classification accuracy. The performance of the proposed iSSO-SGD is affirmed through a comparison with PSO-SGD and with the Adam, Adadelta, RMSprop, and momentum optimizers in terms of image classification accuracy.
- Published
- 2019
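
The abstract describes alternating a swarm-based search with gradient descent. The sketch below is only an illustration of that general idea: it applies Simplified Swarm Optimization's component-wise update (take the global-best, personal-best, current, or a random value per dimension) interleaved with SGD steps on a toy quadratic loss. The loss, the thresholds Cg/Cp/Cw, the learning rate, and the schedule are all assumptions for illustration; this is not the authors' iSSO-SGD or their CNN training setup.

```python
# Hedged sketch: toy hybrid of Simplified Swarm Optimization (SSO) with SGD.
# The quadratic loss stands in for a CNN's training error; thresholds and
# hyperparameters are illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

def loss(w):                      # toy loss standing in for CNN training error
    return float(np.sum((w - 3.0) ** 2))

def grad(w):                      # its analytic gradient
    return 2.0 * (w - 3.0)

dim, n_particles = 10, 8
Cg, Cp, Cw = 0.4, 0.7, 0.9        # SSO probability thresholds (assumed values)
lr = 0.05                         # SGD learning rate (assumed value)

pop = rng.uniform(-5, 5, size=(n_particles, dim))   # candidate weight vectors
pbest = pop.copy()
pbest_f = np.array([loss(w) for w in pop])
gbest = pbest[np.argmin(pbest_f)].copy()

for it in range(200):
    for i in range(n_particles):
        # (1) a few SGD steps refine each candidate locally
        for _ in range(3):
            pop[i] -= lr * grad(pop[i])
        # (2) SSO component-wise update mixes in gbest/pbest information
        rho = rng.random(dim)
        new = np.where(rho < Cg, gbest,
              np.where(rho < Cp, pbest[i],
              np.where(rho < Cw, pop[i], rng.uniform(-5, 5, dim))))
        if loss(new) < loss(pop[i]):
            pop[i] = new
        # (3) bookkeeping for personal and global bests
        f = loss(pop[i])
        if f < pbest_f[i]:
            pbest_f[i], pbest[i] = f, pop[i].copy()
            if f < loss(gbest):
                gbest = pop[i].copy()

print("best loss:", loss(gbest))
```

In a CNN setting, each candidate would instead be a full set of network weights, the gradient step would come from back-propagation on a mini-batch, and the swarm update would serve to pull candidates out of poor local optima between gradient refinements.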