Breast Cancer Detection on Dual-View Sonography via Data-Centric Deep Learning
- Authors
Ting-Ruen Wei, Michele Hell, Aren Vierra, Ran Pang, Young Kang, Mahesh Patel, and Yuling Yan
- Subjects
Breast cancer classification, deep learning, radiologist comparison, Computer applications to medicine. Medical informatics, R858-859.7, Medical technology, R855-855.5
- Abstract
Goal: This study aims to enhance AI-assisted breast cancer diagnosis through dual-view sonography using a data-centric approach. Methods: We customize a DenseNet-based model on our exclusive dual-view breast ultrasound dataset to enhance the model's ability to differentiate between malignant and benign masses. Various assembly strategies are designed to integrate the dual views into the model input, contrasting with the use of single views alone, with the goal of maximizing performance. Subsequently, we compare the model against the radiologist and quantify the improvement in key performance metrics. We further assess how the radiologist's diagnostic accuracy is enhanced with the assistance of the model. Results: Our experiments consistently found that optimal outcomes were achieved by a channel-wise stacking approach incorporating both views, with one duplicated as the third channel. This configuration yielded strong model performance, with an area under the receiver operating characteristic curve (AUC) of 0.9754, specificity of 0.96, and sensitivity of 0.9263, outperforming the radiologist by 50% in specificity. With the model's guidance, the radiologist's performance improved across key metrics: accuracy by 17%, precision by 26%, and specificity by 29%. Conclusions: Our customized model, with an optimal configuration for dual-view image input, surpassed both the radiologist and existing model results in the literature. Integrating the model as a standalone tool or an assistive aid for radiologists can greatly enhance specificity and reduce false positives, thereby minimizing unnecessary biopsies and alleviating radiologists' workload.
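The best-performing input assembly described above, channel-wise stacking of the two views with one duplicated as the third channel, can be sketched as follows. This is a minimal illustration, assuming grayscale views of equal size and a 3-channel model input; which of the two views is duplicated is an assumption, not stated in the abstract.

```python
import numpy as np

def stack_dual_views(view_a: np.ndarray, view_b: np.ndarray) -> np.ndarray:
    """Stack two grayscale ultrasound views channel-wise into a
    3-channel array, duplicating view_a as the third channel.
    (Illustrative sketch; the choice of duplicated view is assumed.)"""
    if view_a.shape != view_b.shape:
        raise ValueError("Both views must have the same spatial size")
    # Resulting shape: (H, W, 3), matching a typical RGB-style model input
    return np.stack([view_a, view_b, view_a], axis=-1)

# Example: two 224x224 views combined into one (224, 224, 3) input
a = np.random.rand(224, 224)
b = np.random.rand(224, 224)
x = stack_dual_views(a, b)
print(x.shape)  # (224, 224, 3)
```

Duplicating one view lets a standard 3-channel backbone (such as DenseNet with pretrained weights) accept the dual-view input without architectural changes.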
- Published
- 2025