Analysis of the Strengths of Various Intermediate Layer Neurons of Radial Basis Function and Vanilla Neural Networks in the Prediction of Signal Power Loss Using Measurement Data from Micro-Cell Environment.
- Author
- Ebhota, Virginia C. and Shongwe, Thokozani
- Subjects
ARTIFICIAL neural networks, BAYESIAN analysis, ELECTROMAGNETISM, RADIAL basis functions, MICROCELLULAR networks (Telecommunication)
- Abstract
This work studied and compared the performance of vanilla neural network (VNN) and radial basis function neural network (RBF-NN) models by varying the number of neurons in their intermediate layers for effective prediction of electromagnetic signal power loss, using measured data from a micro-cell LTE environment. The architectures of the two models, their individual characteristics, and their training and prediction abilities in signal power loss prediction were studied and analyzed. Two training techniques, early stopping and Bayesian Regularization, were applied during training and their performances compared. Results show the best prediction of the measured dataset with 50 neurons in the intermediate layer of the VNN and 70 neurons in the fixed intermediate layer of the RBF-NN, outperforming all other neuron counts considered. The VNN also predicts better than the RBF-NN when the Bayesian Regularization training technique is applied, and Bayesian Regularization outperforms early stopping because it reduces both variance and bias during network training, leading to improved generalization of the network, whereas early stopping reduces variance but not bias. The VNN shows superior performance in signal power loss prediction, with the lowest RMSE, MAE, and SD and the highest correlation coefficient r, compared with the training results of the RBF-NN model, which requires more neurons in its fixed intermediate layer for adequate training. Training the VNN also requires less time than training the RBF-NN model. The RBF-NN nevertheless shows good prediction performance in modeling complex networks: as the number of neurons in its fixed intermediate layer increases, its prediction ability improves and yields better results. Even so, training the RBF-NN requires more time than training the VNN. [ABSTRACT FROM AUTHOR] (An illustrative sketch of the neuron-count comparison described above follows this record.)
- Published
- 2023
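The record does not include the authors' code. Below is a minimal, hypothetical sketch of the kind of neuron-count comparison the abstract describes, written in Python with scikit-learn. The synthetic log-distance path-loss data, the candidate hidden-layer sizes, and the use of MLPRegressor with early stopping are assumptions for illustration only; scikit-learn does not offer Bayesian Regularization training for MLPs, so only the early-stopping variant is sketched. The RMSE, MAE, SD, and r metrics mirror those named in the abstract.

```python
# Illustrative sketch only (not the authors' implementation): compare a
# "vanilla" feed-forward network across several hidden-layer sizes on a
# synthetic path-loss-style regression task, reporting RMSE, MAE, the SD of
# the prediction error, and the correlation coefficient r.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(0)

# Hypothetical stand-in for measured micro-cell data: distance (m) -> path loss (dB),
# generated from a log-distance model with Gaussian shadowing noise.
d = rng.uniform(10, 500, size=1000)
loss_db = 40 + 30 * np.log10(d) + rng.normal(0, 3, size=d.shape)

X_train, X_test, y_train, y_test = train_test_split(
    d.reshape(-1, 1), loss_db, test_size=0.25, random_state=0
)

# Hidden-layer sizes chosen for illustration; the paper reports 50 (VNN) and
# 70 (RBF-NN) neurons as the best-performing configurations.
for n_hidden in (10, 30, 50, 70):
    model = MLPRegressor(
        hidden_layer_sizes=(n_hidden,),
        early_stopping=True,   # hold-out validation stop, analogous to the early-stopping runs
        max_iter=2000,
        random_state=0,
    )
    model.fit(X_train, y_train)
    pred = model.predict(X_test)

    rmse = np.sqrt(mean_squared_error(y_test, pred))
    mae = mean_absolute_error(y_test, pred)
    sd = np.std(y_test - pred)
    r = np.corrcoef(y_test, pred)[0, 1]
    print(f"{n_hidden:3d} neurons: RMSE={rmse:.2f}  MAE={mae:.2f}  SD={sd:.2f}  r={r:.3f}")
```

An RBF-NN counterpart would typically fix Gaussian basis centres (for example via k-means) and fit linear output weights on top; it is omitted here to keep the sketch short and because the exact RBF construction used by the authors is not specified in the abstract.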