PGD-UNet: A Position-Guided Deformable Network for Simultaneous Segmentation of Organs and Tumors

Authors :
Li, Ziqiang
Pan, Hong
Zhu, Yaping
Qin, A. K.
Publication Year :
2020

Abstract

Precise segmentation of organs and tumors plays a crucial role in clinical applications. It is a challenging task owing to the irregular shapes and varied sizes of organs and tumors, as well as the significant class imbalance between the anatomy of interest (AOI) and the background region. In addition, tumors and normal organs often overlap in medical images, yet current approaches fail to delineate both accurately. To tackle these challenges, we propose a position-guided deformable UNet, PGD-UNet, which exploits the spatial deformation capability of deformable convolution to handle the geometric transformations of both organs and tumors. Position information is explicitly encoded into the network to enhance its deformation capability. Meanwhile, we introduce a new pooling module that preserves the position information lost in the conventional max-pooling operation. Moreover, because of unclear boundaries between different structures and the subjectivity of annotations, labels in medical image segmentation tasks are not necessarily accurate; such label noise can cause the trained network to overfit. To address this issue, we formulate a novel loss function that suppresses the influence of potential label noise on the training process. Our method was evaluated on two challenging segmentation tasks and achieved very promising segmentation accuracy in both.

Comment: Accepted by the 2020 International Joint Conference on Neural Networks (IJCNN 2020)
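The abstract does not reproduce the paper's loss formulation, but the idea of suppressing label noise during training can be illustrated with a generic noise-robust objective. The sketch below uses the generalized cross-entropy form L_q = (1 - p_y^q) / q (Zhang & Sabuncu, 2018) as a hypothetical stand-in: as q approaches 0 it recovers standard cross-entropy, while q near 1 behaves like MAE, bounding the gradient contribution of low-confidence (possibly mislabeled) pixels. The function name, the choice of q, and the toy data are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def noise_robust_loss(probs, labels, q=0.7):
    """Generalized cross-entropy style loss: L_q = (1 - p_y^q) / q.

    As q -> 0 this approaches standard cross-entropy (-log p_y); q near 1
    approaches MAE, which bounds the penalty on pixels whose annotated
    class receives low probability -- exactly the pixels most likely to
    carry label noise. Illustrative stand-in, not the paper's loss.

    probs:  (N, C) array of per-pixel class probabilities
    labels: (N,) array of annotated class indices
    """
    p_y = probs[np.arange(len(labels)), labels]  # prob. of annotated class
    return np.mean((1.0 - p_y ** q) / q)

def cross_entropy(probs, labels):
    """Standard cross-entropy for comparison."""
    p_y = probs[np.arange(len(labels)), labels]
    return np.mean(-np.log(p_y))

# Toy example: two confidently correct pixels and one suspected-noisy
# pixel (annotated class 1, but the model assigns it probability 0.05).
probs = np.array([
    [0.90, 0.10],
    [0.20, 0.80],
    [0.95, 0.05],
])
labels = np.array([0, 1, 1])

gce = noise_robust_loss(probs, labels, q=0.7)
ce = cross_entropy(probs, labels)
# The suspected-noisy pixel dominates cross-entropy (-log 0.05 ~ 3.0),
# while its generalized cross-entropy term is bounded (~1.25).
```

In a segmentation network the same computation would be applied per pixel over the softmax output; raising q trades convergence speed for robustness to annotation noise.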

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2007.01001
Document Type :
Working Paper