Integration between constrained optimization and deep networks: a survey
- Authors
Alice Bizzarri, Michele Fraccaroli, Evelina Lamma, and Fabrizio Riguzzi
- Subjects
deep learning, symbolic artificial intelligence, constrained training, constrained neural architecture search, neural-symbolic integration
- Abstract
Integration between constrained optimization and deep networks has garnered significant interest from both research and industrial laboratories. Optimization techniques can be used to select the network structure based not only on loss and accuracy but also on physical constraints. Additionally, constraints can be imposed during training to enhance the performance of networks in specific contexts. This study surveys the literature on the integration of constrained optimization with deep networks. Specifically, we examine the integration of hyper-parameter tuning with physical constraints, such as the number of FLOPS (FLoating point Operations Per Second, a measure of computational capacity), latency, and other factors. This study also considers the use of context-specific knowledge constraints to improve network performance. We discuss the integration of constraints in neural architecture search (NAS), treating it both as a multi-objective optimization (MOO) problem and as one handled by imposing penalties in the loss function. Furthermore, we explore various approaches that integrate logic with deep neural networks (DNNs). In particular, we examine logic-neural integration through constrained optimization applied during the training of NNs and the use of semantic loss, which employs the probabilistic output of the networks to enforce constraints on the output.
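To make the two constraint-handling styles mentioned in the abstract concrete, here is a minimal PyTorch sketch. It is an illustration under assumptions, not code from the surveyed works: the names `resource_penalty` and `semantic_loss_exactly_one`, the latency budget, and the penalty weight `lam` are all hypothetical. It shows (i) a soft penalty added to the training loss when an architecture's expected resource cost exceeds a budget, as in penalty-based constrained NAS, and (ii) a semantic loss (in the style of Xu et al., 2018) for the constraint "exactly one output is true", computed from the network's probabilistic outputs.

```python
import torch

def resource_penalty(expected_cost, budget, lam=0.1):
    """Soft penalty for penalty-based constrained NAS: zero while the
    architecture's expected cost (e.g., FLOPS or latency) stays within
    the budget, growing linearly with the violation otherwise."""
    return lam * torch.relu(expected_cost - budget)

def semantic_loss_exactly_one(probs, eps=1e-12):
    """Semantic loss for 'exactly one output is true':
    -log sum_j p_j * prod_{i != j} (1 - p_i), where probs holds the
    network's independent Bernoulli output probabilities."""
    log_p = torch.log(probs.clamp_min(eps))        # log p_i
    log_q = torch.log((1 - probs).clamp_min(eps))  # log (1 - p_i)
    # log of p_j * prod_{i != j} (1 - p_i), computed for every j at once
    log_terms = log_p + log_q.sum(dim=1, keepdim=True) - log_q
    # log of the total probability mass on constraint-satisfying states
    log_wmc = torch.logsumexp(log_terms, dim=1)
    return -log_wmc.mean()

# Illustrative use: combine a task loss, a latency penalty, and semantic loss.
logits = torch.randn(8, 5, requires_grad=True)  # stand-in network outputs
probs = torch.sigmoid(logits)
task_loss = torch.tensor(0.7)                   # placeholder task loss value
loss = (task_loss
        + resource_penalty(expected_cost=torch.tensor(450.0), budget=400.0)
        + semantic_loss_exactly_one(probs))
loss.backward()
```

In the penalty formulation, the weight `lam` trades constraint satisfaction against task accuracy and must be tuned; the MOO formulation discussed in the survey instead keeps the objectives separate and searches for Pareto-optimal architectures.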
- Published
2024