A novel effective and efficient capsule network via bottleneck residual block and automated gradual pruning.
- Source :
- Computers & Electrical Engineering, Dec 2019, Vol. 80 (pagination not available)
- Publication Year :
- 2019
Abstract
- Capsule Network (CapsNet) complements the invariance properties of convolutional neural networks with equivariance through pose estimation. Although CapsNet achieves decent performance with a shallow architecture, it suffers from a heavy parameter-learning burden in the primary capsule layer and performs poorly on large or complex datasets. To tackle these problems, this paper improves CapsNet with two model-compression techniques: the bottleneck residual block and automated gradual pruning. Specifically, this paper designs an end-to-end framework, denoted Deft Capsule (DCaps). To reduce the number of learnable parameters, it applies bottleneck residual blocks to the primary capsule layer. Furthermore, it applies automated gradual pruning to the dynamic routing procedure to improve the performance of DCaps. Experimental results on four classic datasets demonstrate that DCaps learns fast and significantly outperforms CapsNet on image classification tasks. [ABSTRACT FROM AUTHOR]
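The abstract names two general compression techniques; the paper's exact DCaps architecture is not given here. As a minimal illustrative sketch only, the snippet below shows (a) why a ResNet-style bottleneck residual block (1x1 reduce, 3x3, 1x1 expand) needs far fewer weights than two plain 3x3 convolutions, and (b) the polynomial sparsity schedule commonly used for automated gradual pruning (after Zhu & Gupta). The channel count `C`, reduction factor 4, and schedule parameters are assumptions for the example, not values from the paper.

```python
def conv_params(k, c_in, c_out):
    # Weight count of a k x k convolution (bias omitted): k * k * c_in * c_out.
    return k * k * c_in * c_out

# (a) Parameter comparison: plain residual block vs. bottleneck residual block.
C = 256                       # example channel width (assumption)
plain = 2 * conv_params(3, C, C)          # two 3x3 convs, as in a basic block
mid = C // 4                               # bottleneck reduces channels 4x
bottleneck = (conv_params(1, C, mid)       # 1x1 reduce
              + conv_params(3, mid, mid)   # 3x3 on the narrow representation
              + conv_params(1, mid, C))    # 1x1 expand
print(plain, bottleneck)      # the bottleneck uses ~17x fewer weights here

# (b) Automated gradual pruning: sparsity is ramped from s_i to s_f over
# n pruning steps with a cubic schedule (example hyperparameters).
def sparsity(t, s_i=0.0, s_f=0.9, t0=0, n=100, dt=1):
    t = min(max(t, t0), t0 + n * dt)       # clamp to the pruning window
    return s_f + (s_i - s_f) * (1 - (t - t0) / (n * dt)) ** 3

print(sparsity(0), sparsity(50), sparsity(100))
```

Pruning fastest early (when many weights are redundant) and slowing as the target sparsity is approached is the usual rationale for the cubic ramp.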
- Subjects :
- *PRUNING
- *ARTIFICIAL neural networks
Details
- Language :
- English
- ISSN :
- 0045-7906
- Volume :
- 80
- Database :
- Academic Search Index
- Journal :
- Computers & Electrical Engineering
- Publication Type :
- Academic Journal
- Accession number :
- 139748164
- Full Text :
- https://doi.org/10.1016/j.compeleceng.2019.106481