HSC: Leveraging horizontal shortcut connections for improving accuracy and computational efficiency of lightweight CNN
- Source: Neurocomputing, 457:141-154
- Publication Year: 2021
- Publisher: Elsevier BV
Abstract
- The past few years have witnessed a dramatic increase in the depth of convolutional neural networks (CNNs). Most studies have focused on the CNN's vertical structure (e.g., residual structures that create short paths from early layers to later layers through vertical connections), but little attention has been paid to how features are generated and extracted within a single convolutional layer. In this paper, we identify a non-feature suppression phenomenon during feature extraction. Building on this observation, we propose an orthogonal approach named HSC (Horizontal Shortcut Connections) to improve feature fusion and computational efficiency in CNNs. In particular, HSC reduces the overhead caused by non-feature regions and enhances information fusion for depthwise convolution and group convolution, the key building blocks of lightweight neural networks. In an HSC layer, the feature maps of the preceding layer are connected in the horizontal direction according to our strategy and fused into a new representation, which is then used as the input feature maps of subsequent layers. The HSC block can be plugged into any convolutional neural network that uses group convolution or depthwise convolution, and it improves accuracy with only slight additional computational cost. We evaluate our design on popular lightweight neural networks as well as standard CNN architectures. Compared with existing methods, adding an HSC block after the depthwise convolution improves the accuracy of MobileNet v2 by 1.63% on CIFAR-10, by up to 3.70% on CIFAR-100, and by 2.80% on ImageNet. For MobileNet v3-small, we achieve a 0.8% accuracy improvement on ImageNet. To demonstrate the effect on group convolution, we manually replace standard convolutions with group convolutions and add the HSC block after them, achieving a 4x to 6x reduction in FLOPs while maintaining accuracy. Notably, on ILSVRC-2012, our method reduces the FLOPs of ResNet-50 by more than 43% without any accuracy decline, and by 60.1% with a 0.44% accuracy decline. We also present preliminary hardware results obtained by running the HSC framework on a dedicated hardware platform.
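
The abstract only states that an HSC block is inserted after the depthwise (or group) convolution to fuse feature maps horizontally; it does not give the exact formulation. The sketch below (PyTorch) is therefore an illustrative assumption, not the authors' implementation: the class names `HorizontalFusionSketch` and `DepthwiseBlockWithHSC`, and the choice of a grouped 1x1 convolution plus an identity shortcut as the horizontal fusion, are hypothetical placeholders showing where such a block would sit in a MobileNet-style layer.

```python
# Minimal sketch: a hypothetical horizontal-fusion block placed right after a
# depthwise convolution. The fusion used here (grouped 1x1 conv + identity
# shortcut) is an assumption for illustration, not the paper's HSC definition.
import torch
import torch.nn as nn


class HorizontalFusionSketch(nn.Module):
    """Hypothetical horizontal shortcut: mix sibling feature maps of the same
    layer with a cheap grouped 1x1 convolution and keep an identity path."""

    def __init__(self, channels: int, groups: int = 4):
        super().__init__()
        self.mix = nn.Conv2d(channels, channels, kernel_size=1,
                             groups=groups, bias=False)
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Identity shortcut preserves the original features; the grouped 1x1
        # conv exchanges information across channels of the depthwise output.
        return x + self.bn(self.mix(x))


class DepthwiseBlockWithHSC(nn.Module):
    """Depthwise separable block with the fusion sketch inserted after the
    depthwise convolution, the placement the abstract describes for
    MobileNet-style networks."""

    def __init__(self, in_ch: int, out_ch: int, stride: int = 1):
        super().__init__()
        self.depthwise = nn.Sequential(
            nn.Conv2d(in_ch, in_ch, 3, stride=stride, padding=1,
                      groups=in_ch, bias=False),
            nn.BatchNorm2d(in_ch),
            nn.ReLU6(inplace=True),
        )
        self.hsc = HorizontalFusionSketch(in_ch)  # inserted fusion block
        self.pointwise = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 1, bias=False),
            nn.BatchNorm2d(out_ch),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.pointwise(self.hsc(self.depthwise(x)))


if __name__ == "__main__":
    block = DepthwiseBlockWithHSC(32, 64, stride=1)
    y = block(torch.randn(1, 32, 56, 56))
    print(y.shape)  # torch.Size([1, 64, 56, 56])
```

The design intent mirrored here is that the fusion stays cheap relative to the surrounding layers (a grouped 1x1 convolution adds far fewer FLOPs than the pointwise projection), so accuracy can improve without materially increasing cost.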
Details
- ISSN: 0925-2312
- Volume: 457
- Database: OpenAIRE
- Journal: Neurocomputing
- Accession number: edsair.doi...........f1627b943c64ab8ca1e07bc1b3830c00