1. MOO-DNAS: Efficient Neural Network Design via Differentiable Architecture Search Based on Multi-Objective Optimization
- Author
- Hui Wei, Feifei Lee, Chunyan Hu, and Qiu Chen
- Subjects
- Differentiable neural architecture search, CNNs, multi-objective optimization, accuracy-efficiency trade-off, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
- Progress in improving the performance of neural networks has come at a high price in computational cost and human expertise. The emergence of Neural Architecture Search (NAS) has accelerated network design, but most existing works optimize only for high accuracy without penalizing model complexity. In this paper, we propose an efficient CNN architecture search framework, MOO-DNAS, which performs multi-objective optimization within differentiable neural architecture search. The main goal is to trade off two competing objectives, classification accuracy and network latency, so that the search algorithm can discover efficient models while maintaining high accuracy. To support this, we construct a novel factorized hierarchical search space that allows layer variety and hardware friendliness. Furthermore, a robust sampling strategy named "hard-sampling" is proposed to obtain final structures with higher average performance by keeping the highest-scoring operator. Experimental results on the benchmark datasets MNIST, CIFAR-10, and CIFAR-100 demonstrate the effectiveness of the proposed method. The searched architectures, MOO-DNAS-Nets, achieve advanced accuracy with fewer parameters and FLOPs, and the search cost is less than one GPU-day.
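The abstract describes two core ingredients: a differentiable search whose objective trades off classification accuracy against latency, and a "hard-sampling" step that keeps only the highest-scoring operator. The sketch below illustrates one plausible form of these ideas in PyTorch; the candidate-op latency table, the additive latency penalty, and the trade-off weight `lambda_lat` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """DARTS-style relaxation: a softmax-weighted sum of candidate operators."""
    def __init__(self, ops, latencies):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        # Per-operator latency estimates, e.g. from a hardware lookup table (assumed).
        self.register_buffer("latencies", torch.tensor(latencies, dtype=torch.float32))
        # Architecture parameters (one score per candidate operator).
        self.alpha = nn.Parameter(torch.zeros(len(ops)))

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        out = sum(wi * op(x) for wi, op in zip(w, self.ops))
        # Expected latency of this edge under the current architecture weights.
        expected_latency = (w * self.latencies).sum()
        return out, expected_latency

def search_loss(logits, target, expected_latency, lambda_lat=0.1):
    """Multi-objective search loss: cross-entropy plus a weighted latency penalty
    (a simple additive trade-off; the paper's scalarization may differ)."""
    return F.cross_entropy(logits, target) + lambda_lat * expected_latency

def hard_sample(mixed_op):
    """'Hard-sampling': derive the final structure by keeping only the
    highest-scoring operator on each edge."""
    return mixed_op.ops[int(mixed_op.alpha.argmax())]
```

In this sketch, `lambda_lat` controls the accuracy-latency trade-off during the search, and `hard_sample` is applied after the search converges to extract a discrete architecture.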
- Published
- 2022