
Entropy-Boosted Adversarial Patch for Concealing Pedestrians in YOLO Models

Authors :
Chih-Yang Lin
Tun-Yu Huang
Hui-Fuang Ng
Wei-Yang Lin
Isack Farady
Source :
IEEE Access, Vol. 12, pp. 32772-32779 (2024)
Publication Year :
2024
Publisher :
IEEE, 2024.

Abstract

In recent years, rapid advancements in hardware and deep learning technologies have paved the way for the extensive integration of image recognition and object detection into daily applications. As reliance on deep learning grows, so do concerns about the vulnerabilities of deep neural networks, emphasizing the need to address potential security issues. This research unveils the Entropy-boosted Loss, a novel loss function tailored to generate adversarial patches resembling potted plants. Specifically designed for the YOLOv2, YOLOv3, and YOLOv4 object detectors, these patches impair the detectors’ ability to identify individuals. By increasing the uncertainty in class probability, a person wearing an adversarial patch crafted with our proposed loss function becomes less identifiable by YOLO detectors, achieving the desired adversarial effect. This underscores the importance of understanding the vulnerabilities of YOLO models to adversarial attacks, particularly for individuals aiming to conceal their presence from camera detection. Our experiments, conducted on the INRIA person dataset and under real-time network camera conditions, confirm the effectiveness of our method. Moreover, our technique demonstrates substantial success in virtual try-on environments.
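
Illustrative sketch (not from the paper): the abstract's core idea is to raise the uncertainty of the detector's class probabilities for the patched person. The paper's exact Entropy-boosted Loss formulation is not reproduced here; the snippet below only shows one plausible reading, in which the patch is optimized to maximize the softmax entropy of the class scores. The function name, the toy "detector head", and all hyperparameters are assumptions for demonstration only.

import torch
import torch.nn.functional as F

def entropy_boost_loss(class_logits: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    # class_logits: (N, C) raw class scores for detections covering the patched person.
    # Returns the negative mean entropy; minimizing it pushes the class
    # distribution toward uniform, i.e. maximal uncertainty.
    probs = F.softmax(class_logits, dim=-1)
    entropy = -(probs * torch.log(probs + eps)).sum(dim=-1)  # per-detection entropy
    return -entropy.mean()

if __name__ == "__main__":
    # Toy example: optimize a small "patch" parameter so that a frozen linear
    # stand-in for a YOLO class head becomes maximally uncertain.
    torch.manual_seed(0)
    detector_head = torch.nn.Linear(16, 80)   # hypothetical stand-in, not a real YOLO head
    for p in detector_head.parameters():
        p.requires_grad_(False)

    patch = torch.zeros(1, 16, requires_grad=True)  # stand-in for patch pixels
    opt = torch.optim.Adam([patch], lr=0.1)

    for step in range(100):
        loss = entropy_boost_loss(detector_head(patch))
        opt.zero_grad()
        loss.backward()
        opt.step()

    print("final negative entropy:", loss.item())

In practice the patch would be rendered into training images of people (the paper uses the INRIA person dataset) and optimized end-to-end through the frozen YOLO detector; the toy loop above only isolates the entropy-maximization objective.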

Details

Language :
English
ISSN :
2169-3536
Volume :
12
Database :
Directory of Open Access Journals
Journal :
IEEE Access
Publication Type :
Academic Journal
Accession number :
edsdoj.943dc9f3fb7b498abcc1cbd008a40f96
Document Type :
article
Full Text :
https://doi.org/10.1109/ACCESS.2024.3371507