1. Tuning Deep Neural Network’s Hyperparameters Constrained to Deployability on Tiny Systems
- Author
Riccardo Perego, Antonio Candelieri, Francesco Archetti, Danilo Pau
- Subjects
Hyperparameter optimization, Deep Neural Network, Deep learning, Artificial neural network, Bayesian optimization, Neural Architecture Search
- Abstract
Deep Neural Networks are increasingly deployed on tiny systems such as microcontrollers and embedded devices. Despite the recent success of Deep Learning, enabled in part by the availability of Automated Machine Learning and Neural Architecture Search solutions, the computational requirements of optimizing the structure and hyperparameters of Deep Neural Networks usually far exceed what is available on tiny systems. Deployability therefore becomes a critical concern when the learned model must run on such a system. To overcome this issue, we propose a framework based on Bayesian Optimization that optimizes the hyperparameters of a Deep Neural Network while handling black-box deployability constraints. Encouraging results obtained on a classification benchmark problem on a real microcontroller by STMicroelectronics are presented.
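As a rough illustration of the approach the abstract describes, the sketch below performs Bayesian Optimization under a black-box deployability constraint: one Gaussian Process models the validation error, a second models the on-device memory footprint, and the acquisition function weights expected improvement by the probability that a candidate fits the budget. The two-dimensional search space, the simulated objective, the crude memory model, and the 512 KB budget are all hypothetical placeholders, not the paper's actual benchmark or the STMicroelectronics deployment toolchain.

```python
# Minimal sketch of constrained Bayesian Optimization for deployable DNNs.
# Everything marked "assumed" is illustrative, not taken from the paper.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
MEMORY_BUDGET_KB = 512.0  # assumed flash budget of the target microcontroller

def train_and_evaluate(x):
    """Black-box objective: validation error for hyperparameters
    x = (log2 #units, log2 #layers). Simulated here for self-containment."""
    units, layers = 2 ** x[0], 2 ** x[1]
    return 1.0 / np.sqrt(units * layers) + 0.05 * rng.standard_normal()

def memory_footprint_kb(x):
    """Black-box deployability constraint: estimated model size on device.
    In practice this would be reported by the deployment toolchain."""
    units, layers = 2 ** x[0], 2 ** x[1]
    return 4.0 * units * units * layers / 1024.0  # float32 weights, rough

def expected_improvement(mu, sigma, best):
    z = (best - mu) / np.maximum(sigma, 1e-9)
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# Initial random design over the (log-scaled) search space: assumed bounds.
bounds = np.array([[4.0, 9.0], [0.0, 3.0]])  # 16..512 units, 1..8 layers
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(5, 2))
y = np.array([train_and_evaluate(x) for x in X])
c = np.array([memory_footprint_kb(x) for x in X])

for _ in range(20):
    gp_obj = GaussianProcessRegressor(Matern(nu=2.5), normalize_y=True).fit(X, y)
    gp_con = GaussianProcessRegressor(Matern(nu=2.5), normalize_y=True).fit(X, c)

    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(1000, 2))
    mu, sigma = gp_obj.predict(cand, return_std=True)
    mu_c, sigma_c = gp_con.predict(cand, return_std=True)

    feasible = c <= MEMORY_BUDGET_KB
    best = y[feasible].min() if feasible.any() else y.min()
    # Constrained acquisition: EI weighted by probability of deployability.
    p_feas = norm.cdf((MEMORY_BUDGET_KB - mu_c) / np.maximum(sigma_c, 1e-9))
    x_next = cand[np.argmax(expected_improvement(mu, sigma, best) * p_feas)]

    X = np.vstack([X, x_next])
    y = np.append(y, train_and_evaluate(x_next))
    c = np.append(c, memory_footprint_kb(x_next))

feasible = c <= MEMORY_BUDGET_KB
i = np.flatnonzero(feasible)[np.argmin(y[feasible])]
print(f"best deployable config: {X[i]}, error={y[i]:.3f}, size={c[i]:.0f} KB")
```

Weighting expected improvement by the probability of feasibility is one standard way to handle black-box constraints in Bayesian Optimization; the paper's exact acquisition strategy may differ.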
- Published
2020