Saliency Attack: Towards Imperceptible Black-box Adversarial Attack
- Source :
- ACM Transactions on Intelligent Systems and Technology, 14:1-20
- Publication Year :
- 2023
- Publisher :
- Association for Computing Machinery (ACM)
Abstract
- Deep neural networks are vulnerable to adversarial examples, even in the black-box setting where the attacker has access only to the model output. Recent studies have devised effective black-box attacks with high query efficiency. However, such performance is often accompanied by compromises in attack imperceptibility, hindering the practical use of these approaches. In this article, we propose to restrict the perturbations to a small salient region to generate adversarial examples that can hardly be perceived. This approach is readily compatible with many existing black-box attacks and can significantly improve their imperceptibility with little degradation in attack success rates. Furthermore, we propose the Saliency Attack, a new black-box attack that refines the perturbations in the salient region to achieve even better imperceptibility. Extensive experiments show that, compared to state-of-the-art black-box attacks, our approach achieves much better imperceptibility scores, including most apparent distortion (MAD), L0, and L2 distances, and also obtains a significantly better true success rate and effective query number as judged by a human-like threshold on MAD. Importantly, the perturbations generated by our approach are interpretable to some extent. Finally, the approach is also demonstrated to be robust to different detection-based defenses.
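The core idea described in the abstract, confining an adversarial perturbation to the most salient pixels, can be sketched as below. This is a minimal illustration only; the function name `masked_perturbation`, the `keep_ratio` parameter, and the quantile-based thresholding rule are assumptions for demonstration, not the paper's actual algorithm.

```python
import numpy as np

def masked_perturbation(image, perturbation, saliency_map, keep_ratio=0.1):
    """Apply a perturbation only inside the most salient region.

    image, perturbation: HxWxC arrays; saliency_map: HxW array where
    higher values mean more salient. keep_ratio (hypothetical parameter)
    is the fraction of pixels retained in the salient mask.
    """
    # Threshold the saliency map so only the top `keep_ratio` pixels pass.
    thresh = np.quantile(saliency_map, 1.0 - keep_ratio)
    mask = (saliency_map >= thresh).astype(perturbation.dtype)
    # Zero out the perturbation everywhere outside the salient region.
    return image + perturbation * mask[..., None]
```

Because the mask zeroes most of the perturbation, the resulting adversarial example differs from the clean image only in a small region, which is consistent with the low L0 distances the abstract reports.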
- Subjects :
- FOS: Computer and information sciences
  Computer Science - Machine Learning (cs.LG)
  Computer Science - Cryptography and Security (cs.CR)
  Computer Science - Computer Vision and Pattern Recognition (cs.CV)
  Artificial Intelligence
  Theoretical Computer Science
Details
- ISSN :
- 2157-6912 and 2157-6904
- Volume :
- 14
- Database :
- OpenAIRE
- Journal :
- ACM Transactions on Intelligent Systems and Technology
- Accession number :
- edsair.doi.dedup.....9e25d24e4931201c8bd2e8fcf309a70c
- Full Text :
- https://doi.org/10.1145/3582563