Learning Deep Neural Network Controllers for Dynamical Systems with Safety Guarantees: Invited Paper
- Source :
- ICCAD
- Publication Year :
- 2019
- Publisher :
- IEEE, 2019.
Abstract
- There is recent interest in using deep neural networks (DNNs) to control autonomous cyber-physical systems (CPSs). One challenge with this approach is that many autonomous CPS applications are safety-critical, and it is not clear whether DNNs can produce safe system behaviors. To address this problem, we present an approach that modifies existing (deep) reinforcement learning algorithms to guide controller training so that the overall system is safe. We present a novel verification-in-the-loop training algorithm that uses the formalism of barrier certificates to synthesize DNN controllers that are safe by design. We demonstrate a proof-of-concept evaluation of our technique on multiple CPS examples.
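- The sketch below illustrates, in the spirit of the verification-in-the-loop idea described above, how a barrier-certificate condition could be folded into controller training as a penalty term. Everything in it (the double-integrator dynamics, the quadratic barrier candidate, the network size, and the loss weights) is an assumed placeholder for illustration, not the paper's actual algorithm or models.

```python
# Hypothetical sketch: penalize violations of a barrier-certificate decrease
# condition during controller training (illustrative only, not the paper's method).
import torch
import torch.nn as nn

dt = 0.05
A = torch.tensor([[0.0, 1.0], [0.0, 0.0]])   # assumed double-integrator dynamics
B = torch.tensor([[0.0], [1.0]])

controller = nn.Sequential(nn.Linear(2, 16), nn.Tanh(), nn.Linear(16, 1))

def step(x, u):
    # Discrete-time closed-loop update: x_{t+1} = x_t + dt * (A x_t + B u_t)
    return x + dt * (x @ A.T + u @ B.T)

def barrier(x):
    # Candidate barrier certificate B(x) = ||x||^2 - 1 (assumed safe set: unit ball)
    return (x ** 2).sum(dim=1) - 1.0

opt = torch.optim.Adam(controller.parameters(), lr=1e-3)
for _ in range(200):
    x = 0.5 * torch.randn(256, 2)             # states sampled from an assumed initial region
    u = controller(x)
    x_next = step(x, u)

    perf_loss = (x_next ** 2).mean() + 1e-3 * (u ** 2).mean()   # stand-in control objective
    # Hinge penalty whenever B(x_{t+1}) - B(x_t) > 0, i.e. the barrier condition is violated
    cert_penalty = torch.relu(barrier(x_next) - barrier(x)).mean()

    loss = perf_loss + 10.0 * cert_penalty
    opt.zero_grad()
    loss.backward()
    opt.step()
```

- In the verification-in-the-loop setting the abstract describes, such a penalty would be complemented by formally checking the barrier conditions on the trained controller and feeding any counterexamples back into training; the details of that loop are specific to the paper and not reproduced here.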
- Subjects :
- 0209 industrial biotechnology
Dynamical systems theory
Artificial neural network
Computer science
0102 computer and information sciences
02 engineering and technology
01 natural sciences
Formalism (philosophy of mathematics)
020901 industrial engineering & automation
010201 computation theory & mathematics
Reinforcement learning
Artificial intelligence
Details
- Database :
- OpenAIRE
- Journal :
- 2019 IEEE/ACM International Conference on Computer-Aided Design (ICCAD)
- Accession number :
- edsair.doi...........1c970f8130fb7bbaec4d74b1e5c9bfff