1. CNN Pruning with Multi-Stage Feature Decorrelation.
- Author
- Zhu, Qiuyu and Liu, Chengfei
- Subjects
- *CONVOLUTIONAL neural networks
- Abstract
- This paper proposes a channel pruning method based on multi-stage feature decorrelation to obtain a more efficient convolutional neural network (CNN) model. Based on the correlation of hidden features at each level of the network, we refine more efficient features in each convolutional layer by applying a feature decorrelation constraint (MFD Loss) to every convolutional layer, and then prune channels according to the modulus of the feature maps output by each layer. After several rounds of pruning and fine-tuning, the result is a network with accuracy similar to the original model, a substantially smaller size, and more efficient operation. Experiments on pruning several popular CNN models on standard datasets demonstrate the method's effectiveness. Specifically, for VGG-16 on CIFAR-10, our approach reduces parameters by 97.0% and floating-point operations (FLOPs) by 66.9%, with a 0.4% accuracy gain and state-of-the-art performance. For ResNet-50 on ImageNet, it reduces parameters by 30.0% and FLOPs by 52%, with a 1.4% accuracy loss, further demonstrating the method's effectiveness. The code for the paper can be found at https://github.com/lovelyemperor/MFD. [ABSTRACT FROM AUTHOR] A minimal illustrative sketch of the two ideas described here follows this record.
- Published
- 2024
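The abstract above describes two ingredients: a per-layer feature decorrelation constraint (MFD Loss) applied at every convolutional layer, and channel pruning by the modulus of each layer's output feature maps. The sketch below is a minimal illustration of how such a constraint and pruning criterion could look, assuming PyTorch and a generic off-diagonal correlation penalty; the function names (`decorrelation_loss`, `channel_scores`) and all details are illustrative assumptions, not the authors' exact MFD Loss or pruning rule from the repository linked above.

```python
# Illustrative sketch only: a generic channel-decorrelation penalty and a
# feature-map-norm channel score, loosely following the abstract's description.
import torch


def decorrelation_loss(features: torch.Tensor) -> torch.Tensor:
    """Mean squared off-diagonal entry of the channel correlation matrix.

    features: (N, C, H, W) activations from one convolutional layer.
    Penalizing this term encourages the layer's channels to be decorrelated.
    """
    n, c, h, w = features.shape
    f = features.permute(1, 0, 2, 3).reshape(c, -1)       # (C, N*H*W)
    f = f - f.mean(dim=1, keepdim=True)                   # center each channel
    f = f / (f.norm(dim=1, keepdim=True) + 1e-8)          # unit-normalize
    corr = f @ f.t()                                      # (C, C) correlation matrix
    off_diag = corr - torch.diag(torch.diagonal(corr))    # zero out the diagonal
    return (off_diag ** 2).sum() / (c * (c - 1))


def channel_scores(features: torch.Tensor) -> torch.Tensor:
    """Score each channel by the L2 norm (modulus) of its output feature map,
    averaged over the batch; low-scoring channels are pruning candidates."""
    return features.flatten(2).norm(p=2, dim=2).mean(dim=0)   # (C,)


if __name__ == "__main__":
    x = torch.randn(8, 16, 32, 32)                        # dummy layer activations
    penalty = decorrelation_loss(x)                       # would be added to the task loss
    keep = torch.topk(channel_scores(x), k=12).indices    # keep 12 of 16 channels
    print(penalty.item(), sorted(keep.tolist()))
```

In a training loop following the abstract's recipe, a penalty of this kind would be added to the task loss for every convolutional layer (the multi-stage aspect), and after training, the lowest-scoring channels of each layer would be pruned and the network fine-tuned, repeated over several rounds.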