1. Advances in Brain-Inspired Deep Neural Networks for Adversarial Defense.
- Author
- Li, Ruyi; Ke, Ming; Dong, Zhanguo; Wang, Lubin; Zhang, Tielin; Du, Minghua; Wang, Gang
- Subjects
- ARTIFICIAL neural networks, CONVOLUTIONAL neural networks, OBJECT recognition (Computer vision), IMAGE recognition (Computer vision), INTELLIGENT networks
- Abstract
Deep convolutional neural networks (DCNNs) have achieved impressive performance in image recognition, object detection, and related tasks. Nevertheless, they are susceptible to adversarial attacks and interferential noise. Adversarial attacks can mislead DCNN models by manipulating input data with small perturbations, posing security risks to intelligent system applications, even though these perturbations have very limited perceptual impact on humans. Consequently, research on brain-inspired adversarially robust models has gained increasing attention. Beginning with adversarial attack concepts and schemes, we review conventional adversarial attack and defense methods and compare the advantages and differences between brain-inspired robust neural networks and conventional adversarial defenses. We further review existing adversarially robust DCNN models, including methods inspired by the early visual system and methods supervised by neural signals. Representative examples validate the efficacy of brain-inspired methods for designing adversarially robust models, which may benefit further research and development of brain-inspired robust deep convolutional neural networks and their applications in intelligent systems.
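As an illustration of the small-perturbation attacks described in the abstract, below is a minimal sketch of the fast gradient sign method (FGSM) in PyTorch. The model, labels, and epsilon value are assumptions chosen for illustration; they are not taken from the reviewed paper, which surveys attack and defense methods rather than prescribing this one.

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model, x, y, epsilon=8 / 255):
    """Minimal FGSM sketch: nudge each input pixel by +/- epsilon in the
    direction that increases the classification loss.

    `model`, `x` (input batch in [0, 1]), `y` (integer labels), and
    `epsilon` are illustrative assumptions, not details from the paper.
    """
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    # Sign of the input gradient gives the loss-increasing direction per pixel.
    x_adv = x + epsilon * x.grad.sign()
    # Keep the adversarial example in the valid image range.
    return x_adv.clamp(0.0, 1.0).detach()
```

Perturbations of this magnitude are typically imperceptible to humans yet can flip a DCNN's prediction, which is the gap that the brain-inspired defenses surveyed in the paper aim to close.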
- Published
- 2024