Evaluating Segment Anything Model (SAM) on MRI scans of brain tumors
- Author
Ali, Luqman; Alnajjar, Fady; Swavaf, Muhammad; Elharrouss, Omar; Abd-alrazaq, Alaa; Damseh, Rafat
- Abstract
Automatically segmenting anatomical structures from brain images is a long-standing problem, owing to subject- and image-based variation and the limited availability of annotated data. The Segment Anything Model (SAM), developed by Meta, is a foundation model trained to provide zero-shot segmentation outputs with or without interactive user inputs, demonstrating notable performance across diverse objects and image domains without explicit prior training. This study evaluated SAM's performance in brain tumor segmentation using two publicly available Magnetic Resonance Imaging (MRI) datasets. The study analyzed SAM's standalone segmentation as well as its performance when given user interaction through point prompts and bounding box inputs. SAM exhibited versatility across configurations and datasets; the bounding box prompt consistently outperformed the other configurations in localized precision, achieving average Dice scores of 0.68 on TCGA and 0.56 on BRATS and average IoU values of 0.89 and 0.65, respectively, particularly for tumors with low-to-medium curvature. Inconsistencies were observed, particularly with variations in tumor size, shape, and texture. The study concludes that while SAM can automate medical image segmentation, further training and careful implementation are necessary for diagnostic use, especially in challenging cases such as MRI scans of brain tumors.
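The evaluation protocol described in the abstract (prompting SAM with a bounding box and scoring the predicted mask against a ground-truth annotation using Dice and IoU) can be sketched roughly as below. This is a minimal illustration built on Meta's publicly released segment_anything package, not the authors' code; the checkpoint path, the practice of deriving the box from the ground-truth mask, and the helper names dice_score and iou_score are assumptions for the sake of the example.

```python
import numpy as np
from segment_anything import sam_model_registry, SamPredictor

def dice_score(pred: np.ndarray, gt: np.ndarray) -> float:
    """Dice = 2|A∩B| / (|A| + |B|) over boolean masks."""
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum() + 1e-8)

def iou_score(pred: np.ndarray, gt: np.ndarray) -> float:
    """IoU = |A∩B| / |A∪B| over boolean masks."""
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / (union + 1e-8)

def evaluate_box_prompt(predictor: SamPredictor,
                        mri_slice: np.ndarray,
                        gt_mask: np.ndarray) -> tuple[float, float]:
    """Prompt SAM with a bounding box around the annotated tumor and score the mask.

    mri_slice: HxWx3 uint8 image (e.g., an MRI slice replicated to 3 channels).
    gt_mask:   HxW boolean ground-truth tumor annotation.
    """
    predictor.set_image(mri_slice)

    # Bounding-box prompt in XYXY format, here derived from the annotation
    # (one common choice; the paper's exact prompt-generation scheme may differ).
    ys, xs = np.where(gt_mask)
    box = np.array([xs.min(), ys.min(), xs.max(), ys.max()])

    masks, _, _ = predictor.predict(box=box, multimask_output=False)
    pred_mask = masks[0].astype(bool)
    return dice_score(pred_mask, gt_mask), iou_score(pred_mask, gt_mask)

# Example usage (checkpoint path and data loading are placeholders):
# sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h.pth")
# predictor = SamPredictor(sam)
# dice, iou = evaluate_box_prompt(predictor, mri_slice, gt_mask)
# print(f"Dice: {dice:.3f}, IoU: {iou:.3f}")
```

Averaging these per-slice scores over each dataset would yield summary figures comparable in form to the Dice and IoU values reported in the abstract.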
- Published
2024