
Interspace Pruning: Using Adaptive Filter Representations to Improve Training of Sparse CNNs

Authors:
Wimmer, Paul
Mehnert, Jens
Condurache, Alexandru Paul
Publication Year:
2022

Abstract

Unstructured pruning is well suited to reduce the memory footprint of convolutional neural networks (CNNs), both at training and inference time. CNNs contain parameters arranged in $K \times K$ filters. Standard unstructured pruning (SP) reduces the memory footprint of CNNs by setting filter elements to zero, thereby specifying a fixed subspace that constrains the filter. Especially if pruning is applied before or during training, this induces a strong bias. To overcome this, we introduce interspace pruning (IP), a general tool to improve existing pruning methods. It uses filters represented in a dynamic interspace by linear combinations of an underlying adaptive filter basis (FB). For IP, FB coefficients are set to zero while un-pruned coefficients and FBs are trained jointly. In this work, we provide mathematical evidence for IP's superior performance and demonstrate that IP outperforms SP on all tested state-of-the-art unstructured pruning methods. Especially in challenging situations, like pruning for ImageNet or pruning to high sparsity, IP greatly exceeds SP with equal runtime and parameter costs. Finally, we show that advances of IP are due to improved trainability and superior generalization ability.

Comment: Accepted as conference paper for CVPR 2022
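
The following is a minimal, hypothetical PyTorch sketch of the idea described in the abstract: each $K \times K$ filter is expressed as a linear combination of a shared, trainable filter basis (FB), and pruning zeroes coefficients of that combination rather than filter entries. The class and attribute names (InterspaceConv2d, coeff, basis, mask) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class InterspaceConv2d(nn.Module):
    """Convolution whose spatial filters live in an adaptive interspace (sketch)."""

    def __init__(self, in_ch, out_ch, k=3, padding=1):
        super().__init__()
        self.padding = padding
        # Adaptive filter basis: k*k trainable basis filters of size k x k,
        # initialized to the standard (pixel) basis so the layer starts out
        # equivalent to an ordinary convolution.
        self.basis = nn.Parameter(torch.eye(k * k).view(k * k, k, k))
        # One k*k-dimensional coefficient vector per (out_ch, in_ch) filter (trainable).
        self.coeff = nn.Parameter(torch.randn(out_ch, in_ch, k * k) * 0.1)
        # Binary prune mask on the coefficients; set by the chosen pruning method,
        # not learned by gradient descent.
        self.register_buffer("mask", torch.ones_like(self.coeff))

    def forward(self, x):
        # Reconstruct spatial filters from masked coefficients and the basis:
        # weight[o, i] = sum_b (coeff * mask)[o, i, b] * basis[b]
        weight = torch.einsum("oib,bhw->oihw", self.coeff * self.mask, self.basis)
        return F.conv2d(x, weight, padding=self.padding)
```

In this sketch, a pruning criterion would zero entries of `mask`, while the basis and the surviving coefficients continue to be trained jointly; because the basis is shared per layer, storage and compute stay roughly comparable to standard unstructured pruning.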

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2203.07808
Document Type:
Working Paper