
TrackPGD: Efficient Adversarial Attack using Object Binary Masks against Robust Transformer Trackers

Authors:
Nokabadi, Fatemeh Nourilenjan
Pequignot, Yann Batiste
Lalonde, Jean-François
Gagné, Christian
Publication Year:
2024

Abstract

Adversarial perturbations can deceive neural networks by adding small, imperceptible noise to the input. Recent object trackers with transformer backbones have shown strong performance on tracking datasets, but their adversarial robustness has not been thoroughly evaluated. While transformer trackers are resilient to black-box attacks, existing white-box adversarial attacks are not universally applicable to them owing to differences in backbone architecture. In this work, we introduce TrackPGD, a novel white-box attack that uses predicted object binary masks to target robust transformer trackers. Built upon the powerful segmentation attack SegPGD, TrackPGD effectively influences the decisions of transformer-based trackers. Our method addresses two primary challenges in adapting a segmentation attack to trackers: the limited number of classes and the extreme pixel-level class imbalance. TrackPGD uses the same number of iterations as other attacks on tracker networks and produces competitive adversarial examples that mislead transformer and non-transformer trackers such as MixFormerM, OSTrackSTS, TransT-SEG, and RTS on datasets including VOT2022STS, DAVIS2016, UAV123, and GOT-10k.

Comment: Accepted at the 3rd New Frontiers in Adversarial Machine Learning workshop (AdvML Frontiers @ NeurIPS 2024)
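To make the general idea concrete, below is a minimal, hypothetical sketch of a PGD-style attack driven by an object binary mask, in the spirit of the SegPGD adaptation the abstract describes. It is not the authors' TrackPGD implementation: the model interface, the use of a `pos_weight`-reweighted binary cross-entropy to counter pixel class imbalance, and the hyperparameters (`eps`, `alpha`, `steps`) are all illustrative assumptions.

```python
# Hypothetical sketch: an L-infinity PGD attack that perturbs a video frame so
# the predicted object binary mask diverges from the target mask. All names
# and choices here are assumptions for illustration, not the paper's method.
import torch
import torch.nn.functional as F

def masked_pgd_attack(model, frame, obj_mask, eps=8 / 255, alpha=2 / 255, steps=10):
    """Return an adversarial version of `frame`.

    model:    callable returning per-pixel mask logits of shape (1, 1, H, W)
    frame:    input frame tensor of shape (1, 3, H, W), values in [0, 1]
    obj_mask: binary object mask of shape (1, 1, H, W), values in {0, 1}
    """
    # Reweight the foreground to offset extreme pixel class imbalance
    # (one simple stand-in for the imbalance handling the abstract mentions).
    neg = (obj_mask == 0).sum().clamp(min=1).float()
    pos = (obj_mask == 1).sum().clamp(min=1).float()
    pos_weight = neg / pos

    adv = frame.clone().detach()
    for _ in range(steps):
        adv.requires_grad_(True)
        logits = model(adv)
        # Untargeted attack: maximize the masking loss so the predicted
        # binary mask moves away from the object mask.
        loss = F.binary_cross_entropy_with_logits(
            logits, obj_mask, pos_weight=pos_weight
        )
        grad = torch.autograd.grad(loss, adv)[0]
        with torch.no_grad():
            adv = adv + alpha * grad.sign()                # ascend the loss
            adv = frame + (adv - frame).clamp(-eps, eps)   # project to eps-ball
            adv = adv.clamp(0.0, 1.0)                      # keep a valid image
        adv = adv.detach()
    return adv
```

The `pos_weight` term simply scales the foreground contribution to the loss; the actual TrackPGD loss handles the two-class setting and the pixel imbalance differently, as detailed in the paper.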

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2407.03946
Document Type:
Working Paper