
Gastrointestinal Tract Disease Classification from Wireless Endoscopy Images Using Pretrained Deep Learning Model.

Authors :
Yogapriya J
Chandran V
Sumithra MG
Anitha P
Jenopaul P
Suresh Gnana Dhas C
Source :
Computational and mathematical methods in medicine [Comput Math Methods Med] 2021 Sep 11; Vol. 2021, pp. 5940433. Date of Electronic Publication: 2021 Sep 11 (Print Publication: 2021).
Publication Year :
2021

Abstract

Wireless capsule endoscopy is a noninvasive wireless imaging technology that has become increasingly popular in recent years. One of its major drawbacks is that it generates a large number of images that must be analyzed by medical personnel, which is time-consuming. Various research groups have proposed different image processing and machine learning techniques for classifying gastrointestinal tract diseases. In this research, traditional image processing algorithms and a data augmentation technique are combined with adjusted pretrained deep convolutional neural networks to classify diseases in the gastrointestinal tract from wireless endoscopy images. We take advantage of the pretrained convolutional neural network (CNN) models VGG16, ResNet-18, and GoogLeNet, with adjusted fully connected and output layers. The proposed models are validated on a dataset of 6702 images across 8 classes. The VGG16 model achieved the highest results with 96.33% accuracy, 96.37% recall, 96.5% precision, and 96.5% F1-measure. Compared to other state-of-the-art models, the VGG16 model has the highest Matthews Correlation Coefficient value of 0.95 and Cohen's kappa score of 0.96.
Competing Interests: The authors declare that there is no conflict of interest regarding the publication of this article.
(Copyright © 2021 J. Yogapriya et al.)

Details

Language :
English
ISSN :
1748-6718
Volume :
2021
Database :
MEDLINE
Journal :
Computational and mathematical methods in medicine
Publication Type :
Academic Journal
Accession number :
34545292
Full Text :
https://doi.org/10.1155/2021/5940433