1. Convolutional neural network-based classification of craniosynostosis and suture lines from multi-view cranial X-rays.
- Author
- Kim SM, Yang JS, Han JW, Koo HI, Roh TH, and Yoon SH
- Subjects
- Humans, Infant, Deep Learning, Male, Female, X-Rays, Image Processing, Computer-Assisted methods, Skull diagnostic imaging, Craniosynostoses diagnostic imaging, Craniosynostoses classification, Cranial Sutures diagnostic imaging, Neural Networks, Computer
- Abstract
Early and precise diagnosis of craniosynostosis (CSO), which involves premature fusion of cranial sutures in infants, is crucial for effective treatment. Although computed tomography offers detailed imaging, its high radiation dose poses risks, especially to children. Therefore, we propose a deep-learning model for CSO and suture-line classification using 2D cranial X-rays that minimises radiation-exposure risks and offers reliable diagnoses. We used data comprising 1,047 normal and 277 CSO cases from 2006 to 2023. Our approach integrates X-ray-marker removal, head-pose standardisation, skull-cropping, and fine-tuning modules for CSO and suture-line classification using convolutional neural networks (CNNs). It enhances the diagnostic accuracy and efficiency of identifying CSO from X-ray images, offering a promising alternative to traditional methods. Four CNN backbones exhibited robust performance, with F1-scores exceeding 0.96 and sensitivity and specificity exceeding 0.9, demonstrating the potential for clinical applications. Additionally, preprocessing strategies further enhanced the accuracy, yielding the highest F1-scores, precision, and specificity. A qualitative analysis using gradient-weighted class activation mapping illustrated the focal points of the models. Furthermore, the suture-line classification model distinguishes five suture lines with an accuracy of > 0.9. Thus, the proposed approach can significantly reduce the time and labour required for CSO diagnosis, streamlining its management in clinical settings., (© 2024. The Author(s).)
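The abstract describes a transfer-learning pipeline: preprocessed cranial X-rays fed to fine-tuned, pretrained CNN backbones for binary CSO vs. normal classification. As a rough illustrative sketch of that general approach (not the authors' implementation), the PyTorch snippet below fine-tunes an ImageNet-pretrained ResNet-50 on a folder of labelled X-ray images. The backbone choice, the hypothetical data/train directory layout, the 224x224 input size, and all hyperparameters are assumptions for illustration only; the paper's marker-removal, head-pose standardisation, and skull-cropping modules are not reproduced here.

```python
# Minimal illustrative sketch (not the authors' code): fine-tuning a pretrained
# CNN backbone for binary craniosynostosis (CSO) vs. normal classification from
# cranial X-ray images. Backbone, transforms, and hyperparameters are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import models, transforms
from torchvision.datasets import ImageFolder

# Typical preprocessing: resize, replicate the greyscale X-ray to 3 channels,
# and normalise with the ImageNet statistics expected by the pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical directory layout: data/train/{normal,cso}/*.png
train_set = ImageFolder("data/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Load an ImageNet-pretrained ResNet-50 and replace its classification head
# with a single logit for the binary CSO vs. normal decision.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 1)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

criterion = nn.BCEWithLogitsLoss()   # binary cross-entropy on the raw logit
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(5):               # illustrative epoch count
    for images, labels in train_loader:
        images = images.to(device)
        labels = labels.float().unsqueeze(1).to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The same fine-tuning pattern extends to the multi-class suture-line task by widening the final layer to five outputs and switching to a cross-entropy loss; Grad-CAM visualisations, as mentioned in the abstract, can then be computed on the final convolutional layer of the chosen backbone.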
- Published
- 2024