
Synthetic Skull CT Generation with Generative Adversarial Networks to Train Deep Learning Models for Clinical Transcranial Ultrasound

Authors:
Naftchi-Ardebili, Kasra
Singh, Karanpartap
Pourabolghasem, Reza
Ghanouni, Pejman
Popelka, Gerald R.
Pauly, Kim Butts
Publication Year:
2023

Abstract

Deep learning offers potential for various healthcare applications, yet requires extensive datasets of curated medical images, where data privacy, cost, and distribution mismatch across acquisition centers can become major obstacles. To overcome these challenges, we propose a generative adversarial network (SkullGAN) that creates large datasets of synthetic skull CT slices, geared toward training models for transcranial ultrasound. With wide-ranging applications in the treatment of essential tremor, Parkinson's disease, and Alzheimer's disease, transcranial ultrasound clinical pipelines can be significantly optimized through the integration of deep learning. The main roadblock is the lack of sufficient skull CT slices for training, which SkullGAN aims to address. Real CT slices from 38 healthy subjects were used for training. The generated synthetic skull images were then evaluated on skull density ratio, mean thickness, and mean intensity. Their fidelity was further analyzed using t-distributed stochastic neighbor embedding (t-SNE), the Fréchet inception distance (FID) score, and a visual Turing test (VTT) taken by four staff clinical radiologists. SkullGAN-generated images demonstrated quantitative radiological features similar to those of real skulls. t-SNE failed to separate real and synthetic samples from one another, and the FID score was 49. Expert radiologists achieved a mean accuracy of 60% on the VTT. SkullGAN enables researchers to generate the large numbers of synthetic skull CT segments needed to train neural networks for medical applications involving the human skull, such as transcranial focused ultrasound, mitigating challenges of access, privacy, capital, time, and the need for domain expertise.

Comment: The first two authors contributed equally
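For readers reproducing the fidelity analysis, the FID score cited above is the Fréchet distance between Gaussians fitted to feature embeddings of real and synthetic slices. Below is a minimal sketch of that computation, assuming feature arrays have already been extracted with an Inception-style network; the extraction step and array names are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_inception_distance(feats_real, feats_fake):
    """Frechet distance between two Gaussians fit to feature sets.

    feats_real, feats_fake: (N, D) arrays of features extracted
    from real and synthetic skull CT slices (names are hypothetical).
    """
    # Fit a Gaussian (mean, covariance) to each feature set.
    mu_r, mu_f = feats_real.mean(axis=0), feats_fake.mean(axis=0)
    cov_r = np.cov(feats_real, rowvar=False)
    cov_f = np.cov(feats_fake, rowvar=False)

    diff = mu_r - mu_f
    # Matrix square root of the covariance product; discard tiny
    # imaginary components introduced by numerical error.
    covmean = sqrtm(cov_r @ cov_f)
    if np.iscomplexobj(covmean):
        covmean = covmean.real

    # FID = ||mu_r - mu_f||^2 + Tr(cov_r + cov_f - 2 * covmean)
    return diff @ diff + np.trace(cov_r + cov_f - 2.0 * covmean)
```

Given feature arrays of shape (N, D), the function returns a scalar; lower values indicate that the two distributions are closer (the abstract reports a score of 49 for SkullGAN).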

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2308.00206
Document Type:
Working Paper