
Sparse polynomial optimisation for neural network verification.

Authors :
Newton, Matthew
Papachristodoulou, Antonis
Source :
Automatica. Nov 2023, Vol. 157.
Publication Year :
2023

Abstract

The prevalence of neural networks in applications is increasing rapidly. It is becoming clear that providing robust guarantees on systems that use neural networks is very important, especially in safety-critical applications. A trained neural network's sensitivity to adversarial attacks is one of its greatest shortcomings. To provide robust guarantees, one popular method that has seen success is to describe and bound the activation functions in the neural network using equality and inequality constraints. However, there are numerous ways to form these bounds, each offering a different trade-off between conservativeness and complexity. We approach the problem from a different perspective, using sparse polynomial optimisation theory and the Positivstellensatz, a key result in real algebraic geometry. The former exploits the natural cascading structure of the neural network using ideas from chordal sparsity, while the latter tests the emptiness of a semi-algebraic set algebraically to provide tight bounds. We show that the bounds can be tightened significantly whilst the computational time remains reasonable. We compare the solve times of different solvers and show how accuracy can be improved at the expense of increased computation time. In particular, we show that with the sparse polynomial framework the solve time and accuracy can be improved over other methods for neural network verification, for networks with ReLU, sigmoid and tanh activation functions. [ABSTRACT FROM AUTHOR]
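To make the encoding the abstract alludes to concrete: the ReLU activation z = max(0, x) is described exactly by the semi-algebraic constraints z >= 0, z >= x and z(z - x) = 0, so a verification query becomes a polynomial optimisation problem over a semi-algebraic set, to which Positivstellensatz certificates (and their chordally sparse variants) apply. The sketch below is a minimal illustration of this idea, not the authors' implementation: a first-order moment (Shor-type) SDP relaxation that bounds a single ReLU neuron over an assumed input interval, written with the cvxpy modelling library and the SCS solver (both tooling choices are assumptions, not taken from the paper).

# Hypothetical toy instance (not from the paper): certify an upper
# bound on z = ReLU(x) = max(0, x) for x in [l, u], via the exact
# semi-algebraic ReLU encoding  z >= 0, z >= x, z*(z - x) = 0
# and an order-1 moment (Shor-type) SDP relaxation.
import cvxpy as cp

l, u = -1.0, 1.0  # assumed input interval for this toy example

# Moment matrix over the monomial basis (1, x, z):
# M = [[1,    E[x],   E[z]  ],
#      [E[x], E[x^2], E[xz] ],
#      [E[z], E[xz],  E[z^2]]]
M = cp.Variable((3, 3), symmetric=True)

constraints = [
    M >> 0,                       # moments of a (pseudo-)measure: PSD
    M[0, 0] == 1,                 # normalisation
    M[0, 2] >= 0,                 # z >= 0
    M[0, 2] >= M[0, 1],           # z >= x
    M[2, 2] == M[1, 2],           # z*(z - x) = 0  =>  E[z^2] = E[xz]
    M[0, 1] >= l,
    M[0, 1] <= u,                 # l <= x <= u
    (l + u) * M[0, 1] - M[1, 1] - l * u >= 0,  # localising: (x - l)*(u - x) >= 0
]

prob = cp.Problem(cp.Maximize(M[0, 2]), constraints)
prob.solve(solver=cp.SCS)
print(f"certified upper bound on max(0, x) over [{l}, {u}]: {prob.value:.4f}")
# Prints ~1.0, which is tight for this toy problem; by SDP duality the
# solution corresponds to a degree-2 Positivstellensatz certificate.

For a full network, the paper's framework couples one such encoding per neuron (with polynomial bounds playing the same role for sigmoid and tanh), exploits the cascade structure through chordal sparsity rather than one large dense moment matrix, and can raise the relaxation order to trade computation time for tighter bounds.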

Details

Language :
English
ISSN :
0005-1098
Volume :
157
Database :
Academic Search Index
Journal :
Automatica
Publication Type :
Academic Journal
Accession number :
171922089
Full Text :
https://doi.org/10.1016/j.automatica.2023.111233