Adversarially Robust One-Class Novelty Detection.
- Source :
- IEEE transactions on pattern analysis and machine intelligence [IEEE Trans Pattern Anal Mach Intell] 2023 Apr; Vol. 45 (4), pp. 4167-4179. Date of Electronic Publication: 2023 Mar 07.
- Publication Year :
- 2023
Abstract
- One-class novelty detectors are trained with examples of a particular class and are tasked with identifying whether a query example belongs to the same known class. Most recent advances adopt a deep auto-encoder style architecture to compute novelty scores for detecting novel class data. Deep networks have been shown to be vulnerable to adversarial attacks, yet little attention has been devoted to studying the adversarial robustness of deep novelty detectors. In this article, we first show that existing novelty detectors are susceptible to adversarial examples. We further demonstrate that commonly used defense approaches for classification tasks have limited effectiveness in one-class novelty detection. Hence, a defense specifically designed for novelty detection is needed. To this end, we propose a defense strategy that manipulates the latent space of novelty detectors to improve their robustness against adversarial examples. The proposed method, referred to as Principal Latent Space (PrincipaLS), learns incrementally-trained cascade principal components in the latent space to robustify novelty detectors. PrincipaLS can purify the latent space against adversarial examples and constrain it to exclusively model the known-class distribution. We conduct extensive experiments on eight attacks, five datasets, and seven novelty detectors, showing that PrincipaLS consistently enhances the adversarial robustness of novelty detection models.
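- The core idea of latent-space purification described in the abstract can be illustrated with a minimal sketch: fit principal components on latent codes of known-class examples, then project a query's latent code onto that subspace, discarding off-subspace (potentially adversarial) perturbations. This is an assumption-laden simplification — the function names, the plain one-shot PCA, and the NumPy interface below are illustrative only and do not reproduce the paper's incrementally-trained cascade procedure.

```python
import numpy as np

def fit_principal_components(latents, k):
    """Fit the top-k principal components of known-class latent codes.

    `latents` is an (n, d) array of encoder outputs (hypothetical
    interface; the paper's cascade/incremental training is not
    reproduced here).
    """
    mean = latents.mean(axis=0)
    centered = latents - mean
    # Rows of Vt are the principal directions, ordered by variance.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:k]          # shapes: (d,), (k, d)

def purify(z, mean, components):
    """Project a latent code onto the principal subspace.

    Components of z orthogonal to the known-class subspace (e.g. those
    introduced by an adversarial perturbation) are removed, so the
    purified code stays close to the known-class latent distribution.
    """
    return mean + (z - mean) @ components.T @ components
```

In a full detector, the purified code would be fed to the decoder and the reconstruction error used as the novelty score; the projection acts as a bottleneck that only models the known class.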
Details
- Language : English
- ISSN : 1939-3539
- Volume : 45
- Issue : 4
- Database : MEDLINE
- Journal : IEEE transactions on pattern analysis and machine intelligence
- Publication Type : Academic Journal
- Accession number : 35816537
- Full Text : https://doi.org/10.1109/TPAMI.2022.3189638