
Multi-organ segmentation of abdominal structures from non-contrast and contrast enhanced CT images.

Authors :
Yu C
Anakwenze CP
Zhao Y
Martin RM
Ludmir EB
Niedzielski JS
Qureshi A
Das P
Holliday EB
Raldow AC
Nguyen CM
Mumme RP
Netherton TJ
Rhee DJ
Gay SS
Yang J
Court LE
Cardenas CE
Source :
Scientific reports [Sci Rep] 2022 Nov 09; Vol. 12 (1), pp. 19093. Date of Electronic Publication: 2022 Nov 09.
Publication Year :
2022

Abstract

Manually delineating upper abdominal organs at risk (OARs) is a time-consuming task. To develop a deep-learning-based tool for accurate and robust auto-segmentation of these OARs, forty pancreatic cancer patients with contrast-enhanced breath-hold computed tomography (CT) images were selected. We trained a three-dimensional (3D) U-Net ensemble that segments all organ contours concurrently, using the self-configuring nnU-Net framework. The tool's performance was quantitatively assessed on a held-out test set of 30 patients. Five radiation oncologists from three different institutions rated the tool's contours on a 5-point Likert scale for an additional 75 randomly selected test patients. The mean (± std. dev.) Dice similarity coefficient values between the automatic segmentations and the ground truth on contrast-enhanced CT images were 0.80 ± 0.08, 0.89 ± 0.05, 0.90 ± 0.06, 0.92 ± 0.03, 0.96 ± 0.01, 0.97 ± 0.01, 0.96 ± 0.01, and 0.96 ± 0.01 for the duodenum, small bowel, large bowel, stomach, liver, spleen, right kidney, and left kidney, respectively. 89.3% of duodenum contours on contrast-enhanced images and 85.3% on non-contrast-enhanced images were scored 3 or above, requiring only minor edits; more than 90% of the other organs' contours were scored 3 or above. Our tool achieved a high level of clinical acceptability with a small training dataset and provides accurate contours for treatment planning.
(© 2022. The Author(s).)
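The Dice similarity coefficient reported above is the standard overlap metric DSC(A, B) = 2|A ∩ B| / (|A| + |B|) between an automatic and a manual segmentation. As a minimal sketch of how it could be computed per organ from binary masks (this is not the authors' code; the function name and NumPy-based boolean masks are illustrative assumptions):

    import numpy as np

    def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
        # Dice similarity coefficient between two binary organ masks:
        # DSC = 2 * |pred AND truth| / (|pred| + |truth|)
        pred = pred.astype(bool)
        truth = truth.astype(bool)
        intersection = np.logical_and(pred, truth).sum()
        total = pred.sum() + truth.sum()
        if total == 0:
            return 1.0  # both masks empty: treat as perfect agreement
        return float(2.0 * intersection / total)

In a study like this, one 3D boolean mask per structure (duodenum, small bowel, etc.) would be compared against the corresponding manual ground-truth contour, and the per-organ means and standard deviations quoted in the abstract aggregate this value over the 30 quantitative test patients.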

Details

Language :
English
ISSN :
2045-2322
Volume :
12
Issue :
1
Database :
MEDLINE
Journal :
Scientific reports
Publication Type :
Academic Journal
Accession Number :
36351987
Full Text :
https://doi.org/10.1038/s41598-022-21206-3