Automatic quantitative evaluation of normal pancreas based on deep learning in a Chinese adult population
- Author
- Jinxiu Cai, Xiaochao Guo, Ke Wang, Yaofeng Zhang, Dadou Zhang, Xiaodong Zhang, and Xiaoying Wang
- Subjects
- China, Deep Learning, Radiological and Ultrasound Technology, Urology, Image Processing, Computer-Assisted, Gastroenterology, Humans, Radiology, Nuclear Medicine and Imaging, Pancreas, Retrospective Studies
- Abstract
To develop a 3D U-Net-based model for automatic segmentation of the pancreas and to measure the diameters, volume, and density of the normal pancreas among Chinese adults. A total of 2778 pancreas images (dataset 1) were retrospectively collected and randomly divided into training (n = 2252), validation (n = 245), and test (n = 281) datasets. The segmentation model for the pancreas was constructed by cascading two 3D U-Net networks, and segmentation performance was evaluated with the Dice similarity coefficient (DSC). Another dataset of 3189 normal pancreas CT images (dataset 2) was obtained for external validation, comprising 1063 non-contrast, 1063 arterial phase, and 1063 portal venous phase images. The automatic pancreas segmentations in dataset 2 were assessed objectively and manually revised by two radiologists. The pancreatic volume, diameters, and average CT value were then calculated for each phase of the images in dataset 2, and the relationships between pancreatic volume and age, sex, height, and weight were analyzed. In dataset 1, a mean DSC of 0.94 was achieved on the test dataset. In dataset 2, the objective assessment yielded a 90% satisfaction rate for the automatic segmentation of the pancreas as external validation. The diameters of the pancreas were 43.71-44.28 mm, 67.40-68.15 mm, and 114.53-117.06 mm, respectively, and the average pancreatic volume was 63,969.06-65,247.75 mm³. The pancreas segmentation tool based on deep learning can segment the pancreas on CT images and measure its normal diameter, volume, and CT value accurately and effectively.
- Published
- 2022
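The abstract above reports segmentation accuracy as a Dice similarity coefficient (DSC). The paper's evaluation code is not included in this record; the snippet below is a minimal sketch of how a DSC is typically computed from two binary masks, and the function name and NumPy-based implementation are assumptions rather than the authors' code.

```python
import numpy as np

def dice_similarity_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """DSC = 2 * |P ∩ T| / (|P| + |T|) for two binary segmentation masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    # Convention: two empty masks are treated as a perfect match.
    return 2.0 * float(intersection) / float(denom) if denom > 0 else 1.0
```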
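Likewise, the reported pancreatic volume (in mm³) and average CT value (in HU) can be derived from a segmentation mask together with the scan's voxel spacing. The sketch below illustrates that calculation under those assumptions; the helper name, inputs, and output format are illustrative and are not the authors' implementation, and the diameter measurements described in the abstract are not reproduced here.

```python
import numpy as np

def pancreas_volume_and_density(ct_hu: np.ndarray, mask: np.ndarray, spacing_mm) -> dict:
    """Volume in mm³ and mean CT value in HU for voxels inside a binary pancreas mask.

    ct_hu      : 3D CT volume in Hounsfield units
    mask       : 3D binary segmentation mask, same shape as ct_hu
    spacing_mm : voxel spacing along (z, y, x) in millimetres
    """
    mask = mask.astype(bool)
    voxel_volume_mm3 = float(np.prod(spacing_mm))   # volume of one voxel in mm³
    volume_mm3 = mask.sum() * voxel_volume_mm3      # total segmented volume
    mean_hu = float(ct_hu[mask].mean()) if mask.any() else float("nan")
    return {"volume_mm3": volume_mm3, "mean_hu": mean_hu}
```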