FreePrune: An Automatic Pruning Framework Across Various Granularities Based on Training-Free Evaluation
- Authors
Tang, Miao; Liu, Ning; Yang, Tao; Fang, Haining; Lin, Qiu; Tan, Yujuan; Chen, Xianzhang; Liu, Duo; Zhong, Kan; Ren, Ao
- Abstract
Network pruning is an effective technique for reducing the computational cost of networks while maintaining accuracy. However, pruning requires expert knowledge and hyperparameter tuning, such as determining the pruning rate for each layer. Automatic pruning methods address this challenge by proposing an effective training-free metric that quickly evaluates a pruned network without fine-tuning. However, most existing automatic pruning methods investigate only a single pruning granularity, and it remains unclear whether their metrics benefit automatic pruning at other granularities. Neural architecture search also studies training-free metrics to accelerate network generation; nevertheless, whether these metrics apply to pruning needs further investigation. In this study, we first systematically analyze various advanced training-free metrics across pruning granularities, and then investigate the correlation between the training-free metric score and the accuracy of the model after fine-tuning. Based on this analysis, we propose the FreePrune score, a more general metric compatible with all pruning granularities. To generate high-quality pruned networks and unleash the power of the FreePrune score, we further propose FreePrune, an automatic framework that rapidly generates and evaluates candidate networks, yielding a final pruned network with both high accuracy and a high pruning rate. Experiments show that our method achieves high correlation across various pruning granularities and consistently improves accuracy.
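The generate-and-evaluate loop the abstract describes can be illustrated with a minimal sketch. The toy magnitude-based proxy below is a hypothetical stand-in for the paper's FreePrune score (which is not reproduced here), and the random unstructured masks stand in for candidates at one pruning granularity; all function names are illustrative, not from the paper.

```python
import random

def toy_score(weights, mask):
    # Toy training-free proxy: total magnitude of surviving weights.
    # (A hypothetical stand-in for the paper's FreePrune score.)
    return sum(abs(w) for w, keep in zip(weights, mask) if keep)

def random_mask(n, prune_rate, rng):
    # Random unstructured candidate: keep (1 - prune_rate) of n weights.
    kept = set(rng.sample(range(n), k=n - int(n * prune_rate)))
    return [i in kept for i in range(n)]

def search(weights, prune_rate, num_candidates=50, seed=0):
    # Rapidly generate candidate pruned networks, score each without any
    # fine-tuning, and keep the highest-scoring candidate.
    rng = random.Random(seed)
    best_mask, best = None, float("-inf")
    for _ in range(num_candidates):
        mask = random_mask(len(weights), prune_rate, rng)
        s = toy_score(weights, mask)
        if s > best:
            best, best_mask = s, mask
    return best_mask, best
```

Because scoring needs no training, many candidates can be evaluated cheaply; in the real framework the same loop would apply at other granularities (channels, filters, blocks) by changing how candidates are generated.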
- Published
- 2024