PP-NAS: Searching for Plug-and-Play Blocks on Convolutional Neural Networks
- Source :
- IEEE Transactions on Neural Networks and Learning Systems; September 2024, Vol. 35, Issue 9, pp. 12718-12730 (13 pages)
- Publication Year :
- 2024
Abstract
- Multiscale features are of great importance in modern convolutional neural networks (CNNs), showing consistent performance gains on numerous vision tasks. Therefore, many plug-and-play blocks have been introduced to upgrade existing CNNs for stronger multiscale representation ability. However, the design of plug-and-play blocks has become increasingly complex, and these manually designed blocks are not optimal. In this work, we propose PP-NAS to develop plug-and-play blocks based on neural architecture search (NAS). Specifically, we design a new search space, PPConv, and develop a search algorithm consisting of one-level optimization, a zero-one loss, and a connection existence loss. PP-NAS minimizes the optimization gap between the super-net and its subarchitectures and can achieve good performance even without retraining. Extensive experiments on image classification, object detection, and semantic segmentation verify the superiority of PP-NAS over state-of-the-art CNNs (e.g., ResNet, ResNeXt, and Res2Net). Our code is available at https://github.com/ainieli/PP-NAS.
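- For readers unfamiliar with the ingredients named in the abstract, the following is a minimal, hypothetical PyTorch sketch of how a zero-one loss and a connection existence loss could gate candidate operations in a searchable plug-and-play block under one-level optimization. All names (CandidateOps, the choice of candidate ops, the loss weights) are illustrative assumptions and do not reproduce the actual PP-NAS implementation; see the linked repository for the authors' code.

```python
# Hypothetical sketch only, not the authors' released PP-NAS code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CandidateOps(nn.Module):
    """A mixed edge: candidate ops, each gated by a learnable architecture logit."""
    def __init__(self, channels):
        super().__init__()
        # Illustrative candidate operations for a plug-and-play block.
        self.ops = nn.ModuleList([
            nn.Identity(),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.Conv2d(channels, channels, 3, padding=2, dilation=2, bias=False),
        ])
        # One architecture logit per candidate op (the "connection" parameters).
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        # Sigmoid gates in [0, 1]; the zero-one loss below pushes them toward {0, 1},
        # shrinking the gap between the super-net and the discretized subarchitecture.
        gates = torch.sigmoid(self.alpha)
        return sum(g * op(x) for g, op in zip(gates, self.ops))

def zero_one_loss(alphas):
    """Encourage each gate to saturate at 0 or 1 (term is largest at 0.5)."""
    gates = torch.sigmoid(alphas)
    return (gates * (1.0 - gates)).mean()

def connection_existence_loss(alphas):
    """Penalize the block if all gates collapse to 0, so at least one path survives."""
    gates = torch.sigmoid(alphas)
    return F.relu(1.0 - gates.sum())

# One-level optimization: network weights and architecture parameters share a single
# optimizer and are trained jointly on the same objective, rather than in a bilevel loop.
block = CandidateOps(channels=16)
optimizer = torch.optim.SGD(block.parameters(), lr=0.01, momentum=0.9)

x = torch.randn(2, 16, 32, 32)
target = torch.randn(2, 16, 32, 32)
task_loss = F.mse_loss(block(x), target)  # stand-in for the real task loss
loss = task_loss + 0.1 * zero_one_loss(block.alpha) \
                 + 0.1 * connection_existence_loss(block.alpha)
loss.backward()
optimizer.step()
```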
Details
- Language :
- English
- ISSN :
- 2162-237X (print) and 2162-2388 (electronic)
- Volume :
- 35
- Issue :
- 9
- Database :
- Supplemental Index
- Journal :
- IEEE Transactions on Neural Networks and Learning Systems
- Publication Type :
- Periodical
- Accession number :
- ejs67330680
- Full Text :
- https://doi.org/10.1109/TNNLS.2023.3264551