
Hyperparameter Tuning in Deep Learning-Based Image Classification to Improve Accuracy using Adam Optimization.

Authors :
Sekar, Janarthanan
T, Ganesh Kumar
Source :
International Journal of Performability Engineering; Sep 2023, Vol. 19, Issue 9, p579-586, 8p
Publication Year :
2023

Abstract

Deep learning (DL) is a cutting-edge image-processing technology in which satellite imagery from various sources is processed to analyze, enhance, and classify scenes. This article presents a multilayer DL framework that classifies different types of vegetation and land cover using IRS P6 satellite images acquired across multiple time scales and sources. The core of the design is an ensemble of supervised and unsupervised neural networks (NNs) for optical image categorization and for restoring data lost to mist, reflections, and other natural effects that degrade the images. We contrast the traditional densely connected multilayer perceptron (MLP) and random forest, the most popular method in the remote sensing field, as baseline supervised architectures against convolutional NNs (CNNs). Using these baseline procedures yielded lower accuracy, 94.3%, and required longer computation times to train the model. The hyperparameters to adjust are the number of neurons, the input layer, the optimizer, the number of epochs, the filter size, and the number of iterations; a second stage adjusts the number of layers, which other conventional algorithms lack. Because adding many layers can itself reduce accuracy, we apply the Adam optimizer, which yields a higher accuracy of 96.72% with faster computation and lower memory usage. [ABSTRACT FROM AUTHOR]
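The abstract describes tuning CNN hyperparameters (neurons, filter size, epochs, optimizer) and training with the Adam optimizer. The sketch below is a minimal, hypothetical illustration of that workflow in Keras, not the authors' code: the input shape, class count, grid values, and synthetic stand-in data are assumptions.

```python
# Minimal sketch (assumptions throughout) of hyperparameter tuning for a small CNN
# compiled with the Adam optimizer, using the hyperparameters named in the abstract.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Assumed problem size: 64x64 multispectral patches with 4 bands, 5 land-cover classes.
INPUT_SHAPE = (64, 64, 4)
NUM_CLASSES = 5

def build_cnn(filters=32, kernel_size=3, dense_units=128, learning_rate=1e-3):
    """Build and compile a small CNN; the arguments are the tunable hyperparameters."""
    model = models.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        layers.Conv2D(filters, kernel_size, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(filters * 2, kernel_size, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(dense_units, activation="relu"),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=learning_rate),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

# Small illustrative grid over the hyperparameters mentioned in the abstract.
grid = [
    {"filters": 16, "kernel_size": 3, "dense_units": 64,  "epochs": 10},
    {"filters": 32, "kernel_size": 3, "dense_units": 128, "epochs": 20},
    {"filters": 32, "kernel_size": 5, "dense_units": 128, "epochs": 20},
]

# Synthetic stand-in data; replace with the actual satellite-image patches and labels.
x_train = np.random.rand(200, *INPUT_SHAPE).astype("float32")
y_train = np.random.randint(0, NUM_CLASSES, size=200)

best = None
for cfg in grid:
    model = build_cnn(cfg["filters"], cfg["kernel_size"], cfg["dense_units"])
    hist = model.fit(x_train, y_train, epochs=cfg["epochs"],
                     validation_split=0.2, verbose=0)
    val_acc = max(hist.history["val_accuracy"])
    if best is None or val_acc > best[1]:
        best = (cfg, val_acc)

print("Best configuration:", best[0], "validation accuracy:", round(best[1], 4))
```

A second tuning stage like the one the abstract mentions (varying the number of layers) could be added by parameterizing the count of Conv2D blocks in build_cnn; the loop structure stays the same.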

Details

Language :
English
ISSN :
0973-1318
Volume :
19
Issue :
9
Database :
Supplemental Index
Journal :
International Journal of Performability Engineering
Publication Type :
Academic Journal
Accession number :
173498273
Full Text :
https://doi.org/10.23940/ijpe.23.09.p3.579586