
GAMA: Generative Adversarial Multi-Object Scene Attacks

Authors :
Aich, Abhishek
Ta, Calvin-Khang
Gupta, Akash
Song, Chengyu
Krishnamurthy, Srikanth V.
Asif, M. Salman
Roy-Chowdhury, Amit K.
Publication Year :
2022

Abstract

The majority of methods for crafting adversarial attacks have focused on scenes with a single dominant object (e.g., images from ImageNet). In contrast, natural scenes include multiple dominant objects that are semantically related. It is therefore crucial to design attack strategies that look beyond learning on single-object scenes or attacking single-object victim classifiers. Because generative models craft perturbations that transfer strongly to unknown models, this paper presents the first approach that uses them for adversarial attacks on multi-object scenes. To represent the relationships between different objects in the input scene, we leverage the open-source pre-trained vision-language model CLIP (Contrastive Language-Image Pre-training), with the motivation to exploit the encoded semantics in the language space along with the visual space. We call this attack approach Generative Adversarial Multi-object scene Attacks (GAMA). GAMA demonstrates the utility of the CLIP model as an attacker's tool to train formidable perturbation generators for multi-object scenes. Using the joint image-text features to train the generator, we show that GAMA can craft potent transferable perturbations that fool victim classifiers in various attack settings. For example, GAMA triggers ~16% more misclassification than state-of-the-art generative approaches in black-box settings where both the classifier architecture and the attacker's data distribution differ from the victim's. Our code is available here: https://abhishekaich27.github.io/gama.html

Comment: Accepted at NeurIPS 2022; First two authors contributed equally; Includes Supplementary Material
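The abstract's central idea, training a perturbation generator against CLIP's joint image-text embedding space, can be illustrated with a minimal sketch. The generator architecture, the text prompt, and the loss below are assumptions for illustration only, not GAMA's published objective; the paper and linked code define the actual method.

```python
# A minimal sketch of CLIP-guided perturbation generator training, assuming
# PyTorch and OpenAI's CLIP package (pip install git+https://github.com/openai/CLIP.git).
# The generator, prompt, and loss are illustrative assumptions, not GAMA's objective.
import torch
import torch.nn as nn
import torch.nn.functional as F
import clip


class PerturbationGenerator(nn.Module):
    """Hypothetical image-to-perturbation network; GAMA's generator differs."""

    def __init__(self, eps=10 / 255):
        super().__init__()
        self.eps = eps
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.eps * self.net(x)  # L-infinity-bounded perturbation


device = "cuda" if torch.cuda.is_available() else "cpu"
clip_model, _ = clip.load("ViT-B/32", device=device)
clip_model.float().eval()
for p in clip_model.parameters():
    p.requires_grad_(False)  # CLIP is a frozen surrogate; only the generator trains

gen = PerturbationGenerator().to(device)
opt = torch.optim.Adam(gen.parameters(), lr=1e-4)

# CLIP's input normalization, applied manually so gradients reach the raw pixels.
MEAN = torch.tensor([0.48145466, 0.4578275, 0.40821073], device=device).view(1, 3, 1, 1)
STD = torch.tensor([0.26862954, 0.26130258, 0.27577711], device=device).view(1, 3, 1, 1)

# Assumed text prompt describing the clean scene's semantics.
tokens = clip.tokenize(["a photo of multiple objects"]).to(device)
with torch.no_grad():
    text_feat = F.normalize(clip_model.encode_text(tokens), dim=-1)


def train_step(images):
    """One generator update on a batch of clean images in [0, 1] at 224x224.

    Pushes CLIP features of the adversarial image away from both the clean
    image features and the text features (the joint image-text signal).
    """
    images = images.to(device)
    adv = torch.clamp(images + gen(images), 0.0, 1.0)
    with torch.no_grad():
        clean_feat = F.normalize(clip_model.encode_image((images - MEAN) / STD), dim=-1)
    adv_feat = F.normalize(clip_model.encode_image((adv - MEAN) / STD), dim=-1)
    # Minimizing cosine similarity drives the adversarial embedding away from
    # the clean scene's semantics in both the visual and language spaces.
    loss = (adv_feat * clean_feat).sum(-1).mean() + (adv_feat @ text_feat.T).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

In the transferable black-box setting the abstract describes, a generator trained this way is typically applied once per image at test time, and the perturbed images are then fed to unseen victim classifiers.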

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2209.09502
Document Type :
Working Paper