Out-of-Distribution Detection for Deep Neural Networks With Isolation Forest and Local Outlier Factor
- Source :
- IEEE Access, Vol. 9, pp. 132980-132989 (2021)
- Publication Year :
- 2021
- Publisher :
- IEEE, 2021.
Abstract
- Deep Neural Networks (DNNs) are extensively deployed in today’s safety-critical autonomous systems thanks to their excellent performance. However, they are known to make mistakes unpredictably, e.g., a DNN may misclassify an object if it is used for perception, or issue unsafe control commands if it is used for planning and control. One common cause of such unpredictable mistakes is Out-of-Distribution (OOD) input samples, i.e., samples that fall outside the distribution of the training dataset. We present a framework for OOD detection that performs outlier detection in one or more hidden layers of a DNN, using a runtime monitor based on either Isolation Forest (IF) or Local Outlier Factor (LOF). Performance evaluation indicates that LOF is a promising method in terms of both the machine learning metrics (precision, recall, F1 score, and accuracy) and computational efficiency at test time.
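- The monitoring scheme described in the abstract can be illustrated with a minimal sketch (not the authors' implementation): it assumes hidden-layer activations are already available as feature vectors and uses scikit-learn's IsolationForest and LocalOutlierFactor as stand-ins for the runtime monitors.

```python
# Minimal sketch, assuming hidden-layer activations are extracted from a DNN
# (e.g., the penultimate layer) and used as feature vectors for the monitor.
# Random arrays below are placeholders for those activations.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)

# In-distribution activations from the training set, and shifted test-time
# activations that should be flagged as Out-of-Distribution (OOD).
train_activations = rng.normal(loc=0.0, scale=1.0, size=(1000, 64))
test_activations = rng.normal(loc=3.0, scale=1.0, size=(10, 64))

# Option 1: Isolation Forest (IF) monitor fitted on in-distribution activations.
if_monitor = IsolationForest(n_estimators=100, random_state=0)
if_monitor.fit(train_activations)
if_flags = if_monitor.predict(test_activations)   # -1 = outlier (OOD), +1 = inlier

# Option 2: Local Outlier Factor (LOF) monitor; novelty=True enables
# scoring of unseen samples at runtime.
lof_monitor = LocalOutlierFactor(n_neighbors=20, novelty=True)
lof_monitor.fit(train_activations)
lof_flags = lof_monitor.predict(test_activations)  # -1 = outlier (OOD), +1 = inlier

print("IF flags: ", if_flags)
print("LOF flags:", lof_flags)
```

- In a deployed system, a sample flagged as an outlier by the chosen monitor would be treated as OOD and handled by a fallback (e.g., rejecting the prediction or handing control to a safe policy).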
Details
- Language :
- English
- ISSN :
- 2169-3536
- Volume :
- 9
- Database :
- Directory of Open Access Journals
- Journal :
- IEEE Access
- Publication Type :
- Academic Journal
- Accession number :
- edsdoj.b9170f01c3e48b48d75989a9548f009
- Document Type :
- article
- Full Text :
- https://doi.org/10.1109/ACCESS.2021.3108451