
Filter Pruning via Measuring Feature Map Information

Authors :
Linsong Shao
Haorui Zuo
Jianlin Zhang
Zhiyong Xu
Jinzhen Yao
Zhixing Wang
Hong Li
Source :
Sensors, Vol 21, Iss 19, p 6601 (2021)
Publication Year :
2021
Publisher :
MDPI AG, 2021.

Abstract

Neural network pruning, an important method for reducing the computational complexity of deep models, can be readily applied to devices with limited resources. However, most current methods focus on some property of the filter itself to prune the network, rarely exploring the relationship between the feature maps and the filters. In this paper, two novel pruning methods are proposed. First, a new pruning method is proposed that reflects the importance of filters by exploring the information in the feature maps. Based on the premise that the more information a feature map contains, the more important it is, the information entropy of feature maps is used to measure information, which in turn is used to evaluate the importance of each filter in the current layer. Furthermore, normalization is used to enable cross-layer comparison. As a result, the network structure is efficiently pruned while its performance is well preserved. Second, we propose a parallel pruning method that combines our pruning method above with the slimming pruning method, which achieves better results in terms of computational cost. Our methods perform better in terms of accuracy, parameters, and FLOPs compared to most advanced methods. On ImageNet, 72.02% top-1 accuracy is achieved for ResNet50 with merely 11.41M parameters and 1.12B FLOPs. For DenseNet40, 94.04% accuracy is obtained with only 0.38M parameters and 110.72M FLOPs on CIFAR10, and our parallel pruning method reduces the parameters and FLOPs to just 0.37M and 100.12M, respectively, with little loss of accuracy.
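The core idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it estimates each filter's importance as the Shannon entropy of its output feature map (via an activation histogram, a hypothetical discretization choice), then min-max normalizes the scores within the layer so that scores from different layers are comparable.

```python
import numpy as np

def feature_map_entropy(fmap, bins=32):
    """Shannon entropy of one feature map's activation histogram.
    The histogram binning is an illustrative choice, not from the paper."""
    hist, _ = np.histogram(fmap, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins so log2 is well-defined
    return -np.sum(p * np.log2(p))

def filter_importance(feature_maps):
    """feature_maps: array of shape (C, H, W), one map per filter.
    Returns per-filter entropy scores min-max normalized to [0, 1],
    enabling the cross-layer comparison the abstract mentions."""
    scores = np.array([feature_map_entropy(fm) for fm in feature_maps])
    lo, hi = scores.min(), scores.max()
    return (scores - lo) / (hi - lo + 1e-12)

# Example: rank 8 synthetic filter outputs by information content
rng = np.random.default_rng(0)
maps = rng.normal(size=(8, 14, 14))
scores = filter_importance(maps)
prune_idx = np.argsort(scores)[:2]  # candidates: 2 least-informative filters
```

In practice the scores would be averaged over a batch of inputs before pruning; a single image gives a noisy entropy estimate.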

Details

Language :
English
ISSN :
1424-8220
Volume :
21
Issue :
19
Database :
Directory of Open Access Journals
Journal :
Sensors
Publication Type :
Academic Journal
Accession number :
edsdoj.0a87acad4fa84ca49fbfb7bc568aaa0f
Document Type :
article
Full Text :
https://doi.org/10.3390/s21196601