
Development of Chest X-ray Image Evaluation Software Using the Deep Learning Techniques.

Authors :
Usui, Kousuke
Yoshimura, Takaaki
Ichikawa, Shota
Sugimori, Hiroyuki
Source :
Applied Sciences (2076-3417); Jun 2023, Vol. 13, Issue 11, p6695, 18p
Publication Year :
2023

Abstract

Although the widespread use of digital imaging has enabled real-time image display, chest X-ray images must still be checked visually by the radiologist. With the development of deep learning (DL) technology, applying it to this step would make it possible to determine immediately whether a retake is needed, which is expected to further improve examination throughput. In this study, we developed software for evaluating chest X-ray images to determine whether a repeat radiographic examination is necessary, based on the combined application of DL technologies, and evaluated its accuracy. The target population was 4809 chest images from a public database. Three classification models (CLMs), for lung field defects, obstacle shadows, and the location of obstacle shadows, and a semantic segmentation model (SSM) for the lung field regions were developed using fivefold cross-validation. The CLMs were evaluated using the overall accuracy in the confusion matrix, the SSM was evaluated using the mean intersection over union (mIoU), and the DL-combined software was evaluated using the total response time (RT) per image across the models. The overall accuracies of the CLMs for lung field defects, obstacle shadows, and obstacle shadow location were 89.8%, 91.7%, and 91.2%, respectively. The mIoU of the SSM was 0.920, and the software RT was 3.64 × 10⁻² s per image. These results indicate that the software can immediately and accurately determine whether a chest image needs to be retaken. [ABSTRACT FROM AUTHOR]
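As a rough illustration of the two evaluation metrics named in the abstract (overall accuracy from a confusion matrix for the CLMs, and mIoU for the SSM), the following is a minimal Python/NumPy sketch. The function names, array shapes, and label-map representation are assumptions chosen for illustration, not the authors' implementation.

import numpy as np

def overall_accuracy(y_true, y_pred, n_classes):
    """Overall accuracy = trace of the confusion matrix / total samples."""
    cm = np.zeros((n_classes, n_classes), dtype=np.int64)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1  # row = true class, column = predicted class
    return np.trace(cm) / cm.sum()

def mean_iou(mask_true, mask_pred, n_classes):
    """Mean IoU over classes; masks are integer label maps of equal shape."""
    ious = []
    for c in range(n_classes):
        t = (mask_true == c)
        p = (mask_pred == c)
        union = np.logical_or(t, p).sum()
        if union == 0:
            continue  # class absent in both masks; skip it
        ious.append(np.logical_and(t, p).sum() / union)
    return float(np.mean(ious))

For the SSM described here, n_classes would be two (lung field vs. background), while each CLM would use the number of classes defined for its respective classification task.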

Details

Language :
English
ISSN :
2076-3417
Volume :
13
Issue :
11
Database :
Complementary Index
Journal :
Applied Sciences (2076-3417)
Publication Type :
Academic Journal
Accession number :
164213976
Full Text :
https://doi.org/10.3390/app13116695