
Dual-Targeted Textfooler Attack on Text Classification Systems

Authors :
Hyun Kwon
Source :
IEEE Access, Vol 11, Pp 15164-15173 (2023)
Publication Year :
2023
Publisher :
IEEE, 2023.

Abstract

Deep neural networks provide good performance on classification tasks such as image, audio, and text classification. However, such networks are vulnerable to adversarial examples. An adversarial example is a sample created by adding small adversarial noise to an original data sample in such a way that it is still correctly classified by a human but is misclassified by a deep neural network. Studies on adversarial examples have focused mainly on the image domain, but research is expanding into the text domain as well. Textual adversarial examples designed with two targets in mind can be useful in certain situations. In a military scenario, for example, if enemy models A and B each use a text recognition model, it may be desirable for a strategically designed adversarial message to cause enemy model A's tanks to go to the right and enemy model B's self-propelled guns to go to the left. A dual-targeted adversarial example can accomplish this by causing a different misclassification in each model, in contrast to the single-target adversarial examples produced by existing methods. In this paper, I propose a method for creating a dual-targeted textual adversarial example for attacking a text classification system. Unlike existing adversarial methods, which are designed for images, the proposed method creates dual-targeted adversarial examples that are misclassified as a different class by each of two models, while maintaining the meaning and grammar of the original sentence, by substituting words of high importance. Experiments were conducted using the SNLI dataset and the TensorFlow library. The results demonstrate that the proposed method can generate dual-targeted adversarial examples with an average attack success rate of 82.2% across the two models.

Details

Language :
English
ISSN :
21693536
Volume :
11
Database :
Directory of Open Access Journals
Journal :
IEEE Access
Publication Type :
Academic Journal
Accession number :
edsdoj.3f8597211686401f89a820a785558aa9
Document Type :
article
Full Text :
https://doi.org/10.1109/ACCESS.2021.3121366