
Mask Dynamic Routing to Combined Model of Deep Capsule Network and U-Net.

Authors :
Chen, Junying
Liu, Zhan
Source :
IEEE Transactions on Neural Networks & Learning Systems. Jul 2020, Vol. 31, Issue 7, p2653-2664. 12p.
Publication Year :
2020

Abstract

The capsule network is a novel architecture for encoding feature attributes and spatial relationships in an image. A capsule network (CapsNet) model can be trained with the dynamic routing (DR) algorithm. However, the original three-layer CapsNet with the DR algorithm performs poorly on complex data sets such as FashionMNIST, CIFAR-10, and CIFAR-100, which limits the wider application of capsule networks. In this article, we propose a deep capsule network model combined with a U-Net preprocessing module (DCN-UN). Local-connection and weight-sharing strategies are adopted from convolutional neural networks to design a convolutional capsule layer in the DCN-UN model, which considerably reduces the number of parameters in the network. Moreover, a greedy strategy is incorporated into the design of a mask DR (MDR) algorithm to improve model performance. DCN-UN requires up to five times fewer parameters than the original CapsNet and other CapsNet-based models. The performance improvement of the DCN-UN model with the MDR algorithm over the original CapsNet model with the DR algorithm is approximately 12% on CIFAR-10 and 17% on CIFAR-100. The experimental results confirm that the proposed DCN-UN model preserves the image-reconstruction and equivariance properties of capsule networks. Moreover, an efficient initialization method is explored to enhance training stability and avoid gradient explosion. [ABSTRACT FROM AUTHOR]
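
The abstract references the DR algorithm but does not reproduce the MDR update rule. For orientation only, the following is a minimal NumPy sketch of the standard dynamic routing procedure from the original CapsNet, which the paper's MDR algorithm modifies with a mask and a greedy strategy; the function names, tensor shapes, and the three-iteration default are illustrative assumptions, not the authors' implementation.

import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squashing nonlinearity: scales each vector's length into [0, 1)
    # while preserving its direction.
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, num_iterations=3):
    # u_hat: prediction ("vote") vectors of shape (num_in_caps, num_out_caps, out_dim),
    # where u_hat[i, j] is lower-level capsule i's vote for output capsule j.
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))  # routing logits
    for _ in range(num_iterations):
        # Coupling coefficients: softmax of the logits over the output capsules.
        c = np.exp(b - b.max(axis=1, keepdims=True))
        c = c / c.sum(axis=1, keepdims=True)
        # Weighted sum of votes per output capsule, then squash.
        s = np.einsum('ij,ijd->jd', c, u_hat)
        v = squash(s)
        # Agreement update: votes that align with the output reinforce their route.
        b = b + np.einsum('ijd,jd->ij', u_hat, v)
    return v

# Example: route 32 lower-level capsules to 10 output capsules of dimension 16.
votes = np.random.randn(32, 10, 16)
outputs = dynamic_routing(votes, num_iterations=3)
print(outputs.shape)  # (10, 16)

Per the abstract, MDR differs from this baseline by applying a greedy, mask-based selection during routing, and the DCN-UN model replaces the fully connected capsule transformations with convolutional capsule layers to cut parameter count.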

Details

Language :
English
ISSN :
2162-237X
Volume :
31
Issue :
7
Database :
Academic Search Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
144568178
Full Text :
https://doi.org/10.1109/TNNLS.2020.2984686