
Online Semantic Subspace Learning with Siamese Network for UAV Tracking.

Authors :
Zha, Yufei
Wu, Min
Qiu, Zhuling
Sun, Jingxian
Zhang, Peng
Huang, Wei
Source :
Remote Sensing; 1/15/2020, Vol. 12 Issue 2, p325-325, 1p
Publication Year :
2020

Abstract

In urban environment monitoring, visual tracking on unmanned aerial vehicles (UAVs) can enable more applications owing to its inherent advantages, but it also brings new challenges for existing visual tracking approaches (such as complex background clutter, rotation, fast motion, small objects, and real-time constraints caused by camera motion and viewpoint changes). Based on the Siamese network, tracking can be conducted efficiently on recent UAV datasets. Unfortunately, the learned convolutional neural network (CNN) features are not discriminative enough to separate the target from the background or clutter, in particular from distractors, and cannot capture appearance variations over time. Occlusion and disappearance are also causes of tracking failure. In this paper, a semantic subspace module is designed to be integrated into the Siamese network tracker to encode the local fine-grained details of the target for UAV tracking. More specifically, the target's semantic subspace is learned online to adapt to the target in the temporal domain. Additionally, the pixel-wise response of the semantic subspace can be used to detect occlusion and disappearance of the target, enabling reasonable updating that relieves model drift. Substantial experiments conducted on challenging UAV benchmarks illustrate that the proposed method obtains competitive results in both accuracy and efficiency when applied to UAV videos. [ABSTRACT FROM AUTHOR]
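To make the idea concrete, the sketch below illustrates one plausible reading of the abstract: project search-region CNN features onto an online-learned target subspace and use the per-location reconstruction quality as a pixel-wise response. This is a minimal illustration under assumptions, not the authors' implementation; all names (SubspaceResponse, feat_dim, subspace_dim) are hypothetical.

```python
# Hypothetical sketch of a pixel-wise semantic-subspace response,
# loosely following the abstract; not the authors' code.
import torch


class SubspaceResponse(torch.nn.Module):
    def __init__(self, feat_dim: int, subspace_dim: int = 16):
        super().__init__()
        # Basis of the target's semantic subspace; in the paper this would be
        # adapted online as new frames of the target are observed.
        self.basis = torch.nn.Parameter(torch.randn(feat_dim, subspace_dim))

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, C, H, W) CNN features of the search region.
        b, c, h, w = feats.shape
        x = feats.permute(0, 2, 3, 1).reshape(-1, c)   # (B*H*W, C) feature vectors
        q, _ = torch.linalg.qr(self.basis)             # orthonormalize the basis
        proj = x @ q @ q.T                             # project onto the subspace
        # Response = 1 - normalized reconstruction error: high where a location's
        # features are well explained by the target subspace.
        err = (x - proj).norm(dim=1) / (x.norm(dim=1) + 1e-8)
        return (1.0 - err).reshape(b, h, w)
```

Under this reading, a uniformly low maximum response over the search region would flag occlusion or disappearance, and the online subspace update could be suspended in those frames to relieve model drift, as the abstract describes.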

Details

Language :
English
ISSN :
2072-4292
Volume :
12
Issue :
2
Database :
Complementary Index
Journal :
Remote Sensing
Publication Type :
Academic Journal
Accession number :
141387160
Full Text :
https://doi.org/10.3390/rs12020325