
A modified U-Net convolutional neural network for segmenting periprostatic adipose tissue based on contour feature learning

Authors :
Gang Wang
Jinyue Hu
Yu Zhang
Zhaolin Xiao
Mengxing Huang
Zhanping He
Jing Chen
Zhiming Bai
Source :
Heliyon, Vol 10, Iss 3, Pp e25030 (2024)
Publication Year :
2024
Publisher :
Elsevier, 2024.

Abstract

Objective: This study trains a U-shaped fully convolutional neural network (U-Net) model based on peripheral contour features to achieve rapid, accurate, automated identification and segmentation of periprostatic adipose tissue (PPAT).

Methods: To date, no studies have used deep learning methods to discriminate and segment periprostatic adipose tissue. This paper proposes a modified U-shaped convolutional neural network that learns contour control points from a small dataset of MRI T2W images of PPAT combined with their gradient images, a feature-learning strategy intended to reduce the feature ambiguity caused by differences in PPAT contours across patients. The network is trained by supervised learning on the labeled dataset, and a weighted loss function combining the probability and spatial distribution of control points is proposed to improve convergence speed and detection performance. Based on high-precision detection of the control points, a convex curve fit yields the final PPAT contour. Segmentation results were compared with those of a fully convolutional network (FCN), U-Net, and a semantic segmentation convolutional network (SegNet) on three evaluation metrics: Dice similarity coefficient (DSC), Hausdorff distance (HD), and intersection over union (IoU).

Results: Cropped images with a 270 × 270-pixel matrix yielded DSC, HD, and IoU values of 70.1%, 27 mm, and 56.1%, respectively; downscaled images with a 256 × 256-pixel matrix yielded 68.7%, 26.7 mm, and 54.1%. The U-Net based on peripheral contour features predicted complete periprostatic adipose tissue contours on T2W images at different levels, whereas FCN, the standard U-Net, and SegNet could not.

Conclusion: This U-Net convolutional neural network based on peripheral contour features can identify and segment periprostatic adipose tissue well. Cropped images with a 270 × 270-pixel matrix are better suited to the contour-feature U-Net; reducing the resolution of the original image lowers its accuracy. FCN and SegNet are not appropriate for identifying PPAT on T2-sequence MR images. Our method can segment PPAT automatically, rapidly, and accurately, laying a foundation for PPAT image analysis.
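
Two steps described in the abstract lend themselves to brief illustration. First, the final contour is obtained by fitting a convex curve through the detected control points; the abstract does not specify the fitting procedure, so the Python sketch below is only one plausible realisation, assuming a convex hull of the predicted control points followed by a periodic smoothing spline. The function name fit_convex_contour and its parameters are illustrative, not the authors' implementation.

```python
# Illustrative sketch only: convex hull + periodic smoothing spline is an assumed
# stand-in for the paper's unspecified "convex curve fitting" step.
import numpy as np
from scipy.spatial import ConvexHull
from scipy.interpolate import splprep, splev

def fit_convex_contour(control_points, n_samples=200, smooth=1.0):
    """control_points: (N, 2) array of predicted (x, y) control points.
    Returns an (n_samples, 2) array sampling a smooth, closed contour."""
    hull = ConvexHull(control_points)
    verts = control_points[hull.vertices]     # hull vertices, counter-clockwise
    verts = np.vstack([verts, verts[:1]])     # close the polygon for a periodic fit
    # periodic (closed) smoothing spline through the hull vertices
    tck, _ = splprep([verts[:, 0], verts[:, 1]], s=smooth, per=True)
    u = np.linspace(0.0, 1.0, n_samples)
    x, y = splev(u, tck)
    return np.column_stack([x, y])
```

Second, the reported evaluation metrics (DSC, HD, IoU) are standard segmentation measures. A minimal sketch for boolean masks follows; the spacing_mm argument that converts the Hausdorff distance to millimetres is an assumption of isotropic in-plane spacing, since the pixel spacing is not given in the abstract.

```python
# Standard segmentation metrics named in the abstract (DSC, IoU, Hausdorff distance).
# The pixel-spacing conversion to millimetres is an assumption.
import numpy as np
from scipy.ndimage import binary_erosion
from scipy.spatial.distance import directed_hausdorff

def dice(pred, gt):
    """Dice similarity coefficient between two boolean masks."""
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum() + 1e-8)

def iou(pred, gt):
    """Intersection over union between two boolean masks."""
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / (union + 1e-8)

def hausdorff_mm(pred, gt, spacing_mm=1.0):
    """Symmetric Hausdorff distance between mask boundaries, in millimetres
    (assumes isotropic in-plane spacing of spacing_mm mm per pixel)."""
    def boundary(mask):
        mask = mask.astype(bool)
        return np.argwhere(mask & ~binary_erosion(mask))
    p, g = boundary(pred), boundary(gt)
    d = max(directed_hausdorff(p, g)[0], directed_hausdorff(g, p)[0])
    return d * spacing_mm
```

For example, dice(pred, gt) and hausdorff_mm(pred, gt, spacing_mm=0.5) would score a single 270 × 270 prediction against its ground-truth mask (0.5 mm/pixel is purely an example value).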

Details

Language :
English
ISSN :
24058440
Volume :
10
Issue :
3
Database :
Directory of Open Access Journals
Journal :
Heliyon
Publication Type :
Academic Journal
Accession number :
edsdoj.70447013a8c44abab4d8cb38f441dc31
Document Type :
article
Full Text :
https://doi.org/10.1016/j.heliyon.2024.e25030