Decision support system for nasopharyngeal carcinoma discrimination from endoscopic images using artificial neural network.
- Author
- Mohammed, Mazin Abed, Abd Ghani, Mohd Khanapi, Arunkumar, N., Hamed, Raed Ibraheem, Mostafa, Salama A., Abdullah, Mohamad Khir, and Burhanuddin, M. A.
- Subjects
- DECISION support systems, HISTOGRAMS, CARCINOMA, BENIGN tumors, CANCER, FRACTAL dimensions
- Abstract
- Discriminating between benign and malignant nasopharyngeal carcinoma (NPC) in endoscopic images is one of the most challenging problems in cancer diagnosis because of the many possible tumor shapes, regions, and image intensities; a sound scientific technique is therefore required to extract the features of cancerous NPC tumors. In the present research, a neural network-based automated discrimination system was implemented for the identification of malignant NPC tumors. In the proposed technique, five types of features, namely local binary pattern, gray-level co-occurrence matrix, histogram of oriented gradients, fractal dimension, and entropy, were first extracted from the endoscopic images of NPC tumors, and then the following steps were executed: (1) an enhanced adaptive approach was employed as the post-processing method for the classification of NPC tumors, (2) an assessment foundation was created for the automated identification of malignant NPC tumors, (3) benign and cancerous cases were discriminated using a region growing method and an artificial neural network (ANN), and (4) the efficiency of the outcomes was evaluated by comparing the ANN results. In addition, texture features were found to have a significant effect on separating benign tumors from malignant cases; in the proposed method they act both as an indicator of and an aid to diagnosing malignant NPC tumors. To examine the accuracy of the proposed approach, endoscopic images of 159 abnormal and 222 normal cases were acquired from 249 patients, and the classifier yielded 95.66% precision, 95.43% sensitivity, and 95.78% specificity. [ABSTRACT FROM AUTHOR]
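The feature-extraction stage named in the abstract can be illustrated with a short sketch. The paper lists five feature families (LBP, GLCM, HOG, fractal dimension, entropy) but no implementation; the scikit-image calls, parameter values, and the box-counting helper below are assumptions for illustration, not the authors' code.

```python
# A minimal sketch of the five-feature extraction stage, assuming scikit-image.
# All parameters (radii, cell sizes, thresholds) are illustrative guesses.
import numpy as np
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops, hog
from skimage.measure import shannon_entropy


def fractal_dimension(binary_img: np.ndarray) -> float:
    """Estimate fractal dimension by box counting (hypothetical helper)."""
    sizes = 2 ** np.arange(1, int(np.log2(min(binary_img.shape))))
    counts = []
    for size in sizes:
        # Count boxes of side `size` containing at least one foreground pixel.
        h, w = binary_img.shape
        boxes = binary_img[: h - h % size, : w - w % size].reshape(
            h // size, size, w // size, size
        )
        counts.append(np.any(boxes, axis=(1, 3)).sum())
    # Slope of log(count) vs log(1/size) approximates the fractal dimension.
    slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
    return slope


def extract_features(gray: np.ndarray) -> np.ndarray:
    """Concatenate LBP, GLCM, HOG, fractal-dimension, and entropy features.

    `gray` is assumed to be a 2-D uint8 grayscale endoscopic image.
    """
    # 1. Local binary pattern histogram (uniform patterns: values 0..9).
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)

    # 2. Gray-level co-occurrence matrix statistics.
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256, normed=True)
    glcm_feats = [graycoprops(glcm, p)[0, 0]
                  for p in ("contrast", "homogeneity", "energy", "correlation")]

    # 3. Histogram of oriented gradients (coarse cells to keep the vector small).
    hog_feats = hog(gray, orientations=9, pixels_per_cell=(32, 32),
                    cells_per_block=(1, 1), feature_vector=True)

    # 4. Fractal dimension of the mean-thresholded image.
    fd = fractal_dimension(gray > gray.mean())

    # 5. Shannon entropy of the whole image.
    ent = shannon_entropy(gray)

    return np.concatenate([lbp_hist, glcm_feats, hog_feats, [fd, ent]])
```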
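Likewise, the quoted precision, sensitivity, and specificity follow directly from a confusion matrix. The sketch below is a hypothetical version of the classification and scoring stage, assuming scikit-learn's MLPClassifier stands in for the paper's unspecified ANN; only the 381-image count is taken from the abstract, and the data here is a placeholder.

```python
# A hedged sketch of ANN classification and evaluation; network size,
# train/test split, and the random placeholder data are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix

# X would come from extract_features(); y: 1 = malignant, 0 = benign.
X = np.random.rand(381, 64)           # placeholder for the 381 images
y = np.random.randint(0, 2, 381)      # placeholder labels

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

ann = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
ann.fit(X_tr, y_tr)

# Derive the three reported metrics from the binary confusion matrix.
tn, fp, fn, tp = confusion_matrix(y_te, ann.predict(X_te)).ravel()
precision   = tp / (tp + fp)
sensitivity = tp / (tp + fn)          # true-positive rate
specificity = tn / (tn + fp)          # true-negative rate
print(f"precision={precision:.4f} "
      f"sensitivity={sensitivity:.4f} specificity={specificity:.4f}")
```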
- Published
- 2020