
PREMAP: A Unifying PREiMage APproximation Framework for Neural Networks

Authors:
Zhang, Xiyue
Wang, Benjie
Kwiatkowska, Marta
Zhang, Huan
Publication Year:
2024

Abstract

Most methods for neural network verification focus on bounding the image, i.e., the set of outputs for a given input set. This can be used, for example, to check the robustness of neural network predictions to bounded perturbations of an input. However, verifying properties concerning the preimage, i.e., the set of inputs satisfying an output property, requires abstractions in the input space. We present a general framework for preimage abstraction that produces under- and over-approximations of any polyhedral output set. Our framework employs cheap parameterised linear relaxations of the neural network, together with an anytime refinement procedure that iteratively partitions the input region by splitting on input features and neurons. The effectiveness of our approach relies on carefully designed heuristics and optimization objectives to achieve rapid improvements in the approximation volume. We evaluate our method on a range of tasks, demonstrating significant improvement in efficiency and scalability to high-input-dimensional image classification tasks compared to state-of-the-art techniques. Further, we showcase the application to quantitative verification and robustness analysis, presenting a sound and complete algorithm for the former and providing sound quantitative results for the latter.

Comment: arXiv admin note: text overlap with arXiv:2305.03686
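To make the idea of preimage under-approximation by input splitting concrete, here is a minimal, hypothetical sketch (not the paper's PREMAP algorithm): a toy one-dimensional network f(x) = relu(x) - 0.5, the output property f(x) >= 0, and a refinement loop that bisects the input interval and keeps only the sub-boxes whose interval lower bound certifies the property everywhere. The union of kept boxes is a sound under-approximation of the preimage. All names and parameters below are illustrative assumptions.

```python
def relu(x):
    return max(x, 0.0)

def f_lower(lo, hi):
    # Sound lower bound of f(x) = relu(x) - 0.5 over the box [lo, hi]
    # via interval propagation (relu is monotone, so relu(lo) is exact).
    return relu(lo) - 0.5

def preimage_under_approx(lo, hi, depth=3):
    """Return sub-boxes of [lo, hi] certified to satisfy f(x) >= 0.

    Anytime refinement: if the box cannot be certified as a whole,
    bisect it and recurse until the depth budget runs out.
    """
    if f_lower(lo, hi) >= 0:      # whole box certified: keep it
        return [(lo, hi)]
    if depth == 0:                # budget exhausted: drop (stay sound)
        return []
    mid = (lo + hi) / 2.0
    return (preimage_under_approx(lo, mid, depth - 1)
            + preimage_under_approx(mid, hi, depth - 1))

boxes = preimage_under_approx(-1.0, 1.0, depth=3)
volume = sum(hi - lo for lo, hi in boxes)
```

Here the exact preimage within [-1, 1] is [0.5, 1], and three levels of splitting already recover it exactly (volume 0.5); in general, deeper splitting monotonically improves the approximated volume, which is the anytime behaviour the abstract describes. The paper's framework replaces the interval bound with parameterised linear relaxations and splits on both input features and neurons.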

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2408.09262
Document Type:
Working Paper