MSAA-Net: a multi-scale attention-aware U-Net is used to segment the liver.
- Source:
- Signal, Image & Video Processing; Jun 2023, Vol. 17 Issue 4, p1001-1009, 9p
- Publication Year:
- 2023
Abstract
- Automatic segmentation of the liver from CT images is a very challenging task because the shape of the liver in the abdominal cavity varies from person to person and it often lies in close contact with other organs. In recent years, with the continuous development of deep learning and the introduction of CNNs, neural network-based segmentation models have shown good performance in the field of image segmentation. Among the many network models, U-Net stands out in the task of medical image segmentation. In this paper, we propose a segmentation network, MSAA-Net, combining multi-scale features with an improved attention-aware U-Net. We extract features of different scales on a single feature layer and perform attention perception in the channel dimension. We demonstrate that this architecture improves the performance of U-Net while significantly reducing computational cost. To address the difficulty that U-Net's skip connections are hard to optimize for merging objects of different sizes, we designed a multi-scale attention gate structure (MAG), which allows the model to automatically learn to focus on targets of different sizes. In addition, MAG can be extended to any architecture that contains skip connections, such as U-Net and FCN variants. Our structure was extensively evaluated on the 3Dircadb dataset, and the Dice similarity coefficient of the method on the liver segmentation task was 94.42%, with a much smaller number of model parameters than other attention models. The experimental results show that MSAA-Net achieves very competitive performance in liver segmentation. [ABSTRACT FROM AUTHOR]
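The abstract describes channel-dimension attention applied to features on a U-Net skip connection, but does not give the exact MAG formulation. As a rough illustration of the general idea, here is a minimal NumPy sketch of a channel-attention gate on a skip connection; the function name, MLP weights, and tensor shapes are all hypothetical, and the paper's actual multi-scale design is not reproduced here.

```python
import numpy as np

def channel_attention_gate(skip, gate, w1, w2):
    """Sketch of a channel-wise attention gate on a U-Net skip connection.

    skip: encoder features, shape (C, H, W).
    gate: decoder features at the same resolution, shape (C, H, W).
    w1, w2: weights of a small two-layer MLP (hypothetical; the paper's
    exact MAG structure is not specified in the abstract).
    """
    # Squeeze: global average pooling over spatial dims of both inputs.
    s = skip.mean(axis=(1, 2)) + gate.mean(axis=(1, 2))   # shape (C,)
    # Excite: tiny MLP producing per-channel weights in (0, 1).
    h = np.maximum(w1 @ s, 0.0)                 # ReLU, shape (C//2,)
    a = 1.0 / (1.0 + np.exp(-(w2 @ h)))         # sigmoid, shape (C,)
    # Reweight the skip features channel-wise before concatenation.
    return skip * a[:, None, None]

rng = np.random.default_rng(0)
C, H, W = 8, 16, 16
skip = rng.standard_normal((C, H, W))
gate = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // 2, C))
w2 = rng.standard_normal((C, C // 2))
out = channel_attention_gate(skip, gate, w1, w2)
```

Because the per-channel weights are bounded in (0, 1), the gate can only attenuate uninformative channels of the encoder features before they are merged into the decoder, which is the qualitative behavior an attention gate on a skip connection provides.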
Details
- Language:
- English
- ISSN:
- 1863-1703
- Volume:
- 17
- Issue:
- 4
- Database:
- Complementary Index
- Journal:
- Signal, Image & Video Processing
- Publication Type:
- Academic Journal
- Accession number:
- 163294325
- Full Text:
- https://doi.org/10.1007/s11760-022-02305-0