
Two-stage aware attentional Siamese network for visual tracking.

Authors :
Sun, Xinglong
Han, Guangliang
Guo, Lihong
Yang, Hang
Wu, Xiaotian
Li, Qingqing
Source :
Pattern Recognition. Apr 2022, Vol. 124.
Publication Year :
2022

Abstract

• We propose a novel two-stage aware training framework for Siamese networks, in which position-aware and appearance-aware training schemes are presented to optimize the shallow and the deep network layers, respectively. This contribution helps the Siamese tracker achieve precise and robust visual tracking.
• An effective feature selection module is presented to solve the online adaptation problem of the Siamese tracker. By analyzing how the feature distribution changes, the module combines diverse attention networks in a unique way to uncover the truly discriminative features for the current object.
• The proposed tracker is extensively evaluated on four popular benchmark datasets. The results demonstrate that it outperforms other state-of-the-art methods in terms of accuracy and robustness.

Siamese networks have achieved great success in visual tracking thanks to their speed and accuracy. However, tracking an object precisely and robustly remains challenging. One reason is that good precision and robustness require multiple types of features, which cannot be obtained in a single training phase. Moreover, Siamese networks usually struggle with the online adaptation problem. In this paper, we present a novel two-stage aware attentional Siamese network for tracking (Ta-ASiam). Concretely, we first propose a position-aware and an appearance-aware training strategy to optimize different layers of the Siamese network. By introducing diverse training patterns, the two types of required features can be captured simultaneously. Then, following the rule of feature distribution, an effective feature selection module is constructed by combining channel and spatial attention networks to adapt to rapid appearance changes of the object. Extensive experiments on various recent benchmarks demonstrate the effectiveness of our method, which significantly outperforms state-of-the-art trackers. [ABSTRACT FROM AUTHOR]
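The abstract describes the feature selection module only at a high level, as a combination of channel and spatial attention networks applied to the tracker's features. The sketch below is not the authors' code; it is a minimal, hypothetical PyTorch illustration assuming a CBAM-style arrangement (channel attention followed by spatial attention), with all class names, the reduction ratio, and the kernel size being illustrative choices rather than values taken from the paper.

```python
# Hypothetical sketch of a channel + spatial attention feature-selection block,
# assuming a CBAM-style design; not the authors' implementation.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Re-weights feature channels via global average pooling and a small MLP."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):                      # x: (B, C, H, W)
        w = x.mean(dim=(2, 3))                 # global average pool -> (B, C)
        w = torch.sigmoid(self.mlp(w))         # per-channel weights in (0, 1)
        return x * w[:, :, None, None]


class SpatialAttention(nn.Module):
    """Highlights discriminative spatial locations with a single conv layer."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)      # (B, 1, H, W)
        mx, _ = x.max(dim=1, keepdim=True)     # (B, 1, H, W)
        w = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * w


class FeatureSelection(nn.Module):
    """Channel attention followed by spatial attention on the extracted features."""
    def __init__(self, channels):
        super().__init__()
        self.channel = ChannelAttention(channels)
        self.spatial = SpatialAttention()

    def forward(self, x):
        return self.spatial(self.channel(x))


# Usage example on dummy Siamese backbone features:
# feats = torch.randn(1, 256, 22, 22)
# selected = FeatureSelection(256)(feats)
```

In this assumed arrangement, the channel branch suppresses channels that are no longer discriminative for the current object while the spatial branch emphasizes its location, which is one plausible reading of how such a module could adapt to rapid appearance changes; the paper should be consulted for the actual design.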

Details

Language :
English
ISSN :
0031-3203
Volume :
124
Database :
Academic Search Index
Journal :
Pattern Recognition
Publication Type :
Academic Journal
Accession number :
155491538
Full Text :
https://doi.org/10.1016/j.patcog.2021.108502