
Joint filter and channel pruning of convolutional neural networks as a bi-level optimization problem.

Authors :
Louati, Hassen
Louati, Ali
Bechikh, Slim
Kariri, Elham
Source :
Memetic Computing; Mar2024, Vol. 16 Issue 1, p71-90, 20p
Publication Year :
2024

Abstract

Deep neural networks, specifically deep convolutional neural networks (DCNNs), have been highly successful in machine learning and computer vision, but a significant challenge when using these networks is choosing the right hyperparameters. As the number of layers in the network increases, the search space grows accordingly. To overcome this issue, researchers in deep learning have suggested deep compression techniques to reduce memory usage and computational complexity. In this paper, we present a new approach for compressing deep CNNs that combines filter and channel pruning based on Evolutionary Algorithms (EA). The method eliminates filters and channels to reduce the number of parameters and the computational complexity of the model. Additionally, we formulate the interaction between the pruning decisions of the convolution layers as a bi-level optimization problem. Bi-level optimization problems are known to be difficult because they involve two nested optimization tasks, where only optimal solutions to the lower-level problem are accepted as feasible candidates for the upper-level problem. In this work, the upper-level problem selects a set of filters to prune so as to minimize the number of selected filters, while the lower-level problem selects a set of channels to prune so as to minimize the number of selected channels per filter. We develop a new method for solving this bi-level problem, named Bi-CNN-Pruning, adopting the Co-Evolutionary Migration-Based Algorithm (CEMBA) as its search engine. Bi-CNN-Pruning is then evaluated on image classification benchmarks using the well-known CIFAR-10 and CIFAR-100 datasets. The evaluation results demonstrate that our bi-level proposal outperforms state-of-the-art architectures, and we provide a detailed analysis of the results using commonly employed performance metrics. [ABSTRACT FROM AUTHOR]
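The two nested pruning levels described in the abstract can be made concrete with a small sketch. The code below is a hypothetical, NumPy-only illustration, not the paper's Bi-CNN-Pruning or CEMBA implementation: the per-channel importance scores, the accuracy proxy and its budget, and the simple bit-flip evolutionary loop are all assumptions introduced solely to show how an upper-level filter mask is scored through the best lower-level channel mask it induces.

# Minimal, hypothetical sketch of bi-level filter/channel pruning.
# Not the authors' Bi-CNN-Pruning or CEMBA code: the importance scores,
# accuracy proxy, and (mu + lambda)-style bit-flip evolution below are
# illustrative stand-ins used only to make the two nested levels concrete.
import numpy as np

rng = np.random.default_rng(0)
N_FILTERS, N_CHANNELS = 16, 8                       # toy convolution layer
IMPORTANCE = rng.random((N_FILTERS, N_CHANNELS))    # stand-in saliency scores
ACC_BUDGET = 0.6                                    # accuracy proxy to preserve

def evolve(population, fitness, generations=15, flip_rate=0.1):
    # Simple (mu + lambda) evolution with bit-flip mutation; CEMBA itself uses
    # co-evolving, migrating populations, which this stand-in does not model.
    for _ in range(generations):
        children = population ^ (rng.random(population.shape) < flip_rate)
        merged = np.concatenate([population, children])
        merged = merged[np.argsort([-fitness(m) for m in merged])]
        population = merged[: len(population)]
    return population[0], fitness(population[0])

def lower_level(filter_mask):
    # Lower level: for the kept filters, minimise the number of kept channels
    # while the accuracy proxy stays above the budget.
    kept = np.flatnonzero(filter_mask)
    if kept.size == 0:
        return np.zeros((N_FILTERS, N_CHANNELS), dtype=int), -np.inf
    def fitness(ch_mask):
        score = IMPORTANCE[kept][ch_mask.astype(bool)].sum() / IMPORTANCE.sum()
        penalty = 0.0 if score >= ACC_BUDGET else (ACC_BUDGET - score) * 1e3
        return -(ch_mask.sum() + penalty)           # fewer channels is better
    pop = rng.integers(0, 2, size=(10, kept.size, N_CHANNELS))
    best, best_fit = evolve(pop, fitness)
    full = np.zeros((N_FILTERS, N_CHANNELS), dtype=int)
    full[kept] = best
    return full, best_fit

def upper_level():
    # Upper level: minimise the number of kept filters; every candidate is
    # scored through the best lower-level (channel) response it induces.
    def fitness(f_mask):
        return lower_level(f_mask)[1] - f_mask.sum()
    pop = rng.integers(0, 2, size=(10, N_FILTERS))
    best, _ = evolve(pop, fitness)
    return best

if __name__ == "__main__":
    filters = upper_level()
    channels, _ = lower_level(filters)
    print("kept filters :", int(filters.sum()), "/", N_FILTERS)
    print("kept channels:", int(channels.sum()), "/", N_FILTERS * N_CHANNELS)

Running the sketch prints how many filters and channels survive under the toy budget; in the paper this role is played by actual network accuracy and FLOP/parameter counts rather than the synthetic importance scores assumed here.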

Details

Language :
English
ISSN :
1865-9284
Volume :
16
Issue :
1
Database :
Complementary Index
Journal :
Memetic Computing
Publication Type :
Academic Journal
Accession number :
176032544
Full Text :
https://doi.org/10.1007/s12293-024-00406-6