
Evaluation of Parameter Settings for Training Neural Networks Using Backpropagation Algorithms

Authors :
Leema N.
Khanna H. Nehemiah
Elgin Christo V. R.
Kannan A.
Publication Year :
2022
Publisher :
IGI Global, 2022.

Abstract

Artificial neural networks (ANNs) are widely used for classification, and the backpropagation (BP) algorithm is the training algorithm most commonly used. The major bottleneck in backpropagation neural network training is fixing appropriate values for the network parameters: the initial weights, biases, activation function, number of hidden layers, number of neurons per hidden layer, number of training epochs, learning rate, minimum error, and momentum term for the classification task. The objective of this work is to investigate the performance of 12 different BP algorithms and the impact of variations in network parameter values on neural network training. The algorithms were evaluated with different training and testing samples drawn from three benchmark clinical datasets, namely the Pima Indian Diabetes (PID), Hepatitis, and Wisconsin Breast Cancer (WBC) datasets, obtained from the University of California Irvine (UCI) machine learning repository.
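
To illustrate how such parameter settings map onto an actual BP-trained network, the following is a minimal sketch (not the authors' implementation) using scikit-learn's MLPClassifier, which trains a multilayer perceptron by backpropagation with stochastic gradient descent. The parameter values shown (one hidden layer of 10 neurons, sigmoid activation, learning rate 0.1, momentum 0.9, 500 epochs, tolerance 1e-4) are illustrative assumptions, not values reported in the paper, and the Wisconsin Breast Cancer data is loaded from the copy bundled with scikit-learn rather than directly from the UCI repository.

Example (Python):

    # Minimal sketch: a backpropagation-trained MLP with explicit parameter
    # settings. Values are illustrative only, not the paper's settings.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import accuracy_score

    # Wisconsin Breast Cancer data (scikit-learn's bundled copy of the UCI set)
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    # Scale inputs; gradient-descent training is sensitive to feature ranges.
    scaler = StandardScaler().fit(X_train)
    X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

    clf = MLPClassifier(
        hidden_layer_sizes=(10,),   # one hidden layer, 10 neurons
        activation="logistic",      # sigmoid activation function
        solver="sgd",               # gradient descent with momentum (backprop)
        learning_rate_init=0.1,     # learning rate
        momentum=0.9,               # momentum term
        max_iter=500,               # number of training epochs
        tol=1e-4,                   # minimum error (stopping tolerance)
        random_state=0,             # fixes the random initial weights/biases
    )
    clf.fit(X_train, y_train)
    print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))

Varying any of these arguments (hidden layer sizes, activation, learning rate, momentum, epochs, tolerance) and re-running the fit/score loop gives a simple way to reproduce the kind of parameter-sensitivity comparison the abstract describes.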

Details

Database :
OpenAIRE
Accession number :
edsair.doi...........00d604ae10d179a50d598c1799432017